Do LLMs estimate uncertainty well in instruction-following?
SimpleStrat: Diversifying Language Model Generation with Stratification
Me, Myself, and AI: The Situational Awareness Dataset (SAD) for LLMs
Be like a Goldfish, Don’t Memorize! Mitigating Memorization in Generative LLMs
Superposed Decoding: Multiple Generations from a Single Autoregressive Inference Pass
Probabilistic Inference in Language Models via Twisted Sequential Monte Carlo
Quiet-STaR: Language Models Can Teach Themselves to Think Before Speaking
Monitoring AI-Modified Content at Scale: A Case Study on the Impact of ChatGPT on AI Conference Peer Reviews
The Non-Effect of Sampling Temperature on Problem Solving in GPT-3.5/GPT-4
Blending Is All You Need: Cheaper, Better Alternative to Trillion-Parameters LLM
Universal Self-Consistency for Large Language Model Generation
SEDD: Discrete Diffusion Modeling by Estimating the Ratios of the Data Distribution
Let Models Speak Ciphers: Multiagent Debate through Embeddings
Contrastive Decoding Improves Reasoning in Large Language Models
Accelerating LLM Inference with Staged Speculative Decoding
Sequential Monte Carlo Steering of Large Language Models using Probabilistic Programs
MUX-PLMs: Pre-training Language Models with Data Multiplexing
Characterizing Attribution and Fluency Tradeoffs for Retrieval-Augmented Large Language Models
Witscript 3: A Hybrid AI System for Improvising Jokes in a Conversation
A survey on text generation using generative adversarial networks
Contrastive Decoding: Open-ended Text Generation as Optimization
Help me write a poem: Instruction Tuning as a Vehicle for Collaborative Poetry Writing (CoPoet)
Contrastive Search Is What You Need For Neural Text Generation
Arithmetic Sampling: Parallel Diverse Decoding for Large Language Models
Most Language Models can be Poets too: An AI Writing Assistant and Constrained Text Generation Studio
Ask Me Anything (AMA): A simple strategy for prompting language models
Out of One, Many: Using Language Models to Simulate Human Samples
Red Teaming Language Models to Reduce Harms: Methods, Scaling Behaviors, and Lessons Learned
DIRECTOR: Generator-Classifiers For Supervised Language Modeling
RankGen: Improving Text Generation with Large Ranking Models
Controllable Natural Language Generation with Contrastive Prefixes
FIGARO: Generating Symbolic Music with Fine-Grained Artistic Control
A Survey of Controllable Text Generation using Transformer-based Pre-trained Language Models
NeuroLogic A✱esque Decoding: Constrained Text Generation with Lookahead Heuristics
Controllable Generation from Pre-trained Language Models via Inverse Prompting
Improving Diversity of Neural Text Generation via Inverse Probability Weighting
There Once Was a Really Bad Poet, It Was Automated but You Didn’t Know It
A✱ Search Without Expansions: Learning Heuristic Functions with Deep Q-Networks
MAUVE: Measuring the Gap Between Neural Text and Human Text using Divergence Frontiers
Prefix-Tuning: Optimizing Continuous Prompts for Generation
Collaborative Storytelling with Large-scale Neural Language Models
NeuroLogic Decoding: (Un)supervised Neural Text Generation with Predicate Logic Constraints
Interacting with GPT-2 to Generate Controlled and Believable Musical Sequences in ABC Notation
MEGATRON-CNTRL: Controllable Story Generation with External Knowledge Using Large-Scale Language Models
A Systematic Characterization of Sampling Algorithms for Open-ended Language Generation
Mirostat: A Neural Text Decoding Algorithm that Directly Controls Perplexity
true_poetry: Poetry generator by GPT-2 with meter and rhyme constraints
Trading Off Diversity and Quality in Natural Language Generation
Rapformer: Conditional Rap Lyrics Generation with Denoising Autoencoders
Top-K Training of GANs: Improving GAN Performance by Throwing Away Bad Samples
Controlling Text Generation with Plug and Play Language Models
Plug and Play Language Models: A Simple Approach to Controlled Text Generation
CTRL: A Conditional Transformer Language Model For Controllable Generation
Good News, Everyone! Context driven entity-aware captioning for news images
Insertion Transformer: Flexible Sequence Generation via Insertion Operations
Blockwise Parallel Decoding for Deep Autoregressive Models
OCD: Optimal Completion Distillation for Sequence Learning
Controlling Linguistic Style Aspects in Neural Language Generation
Language Generation with Recurrent Generative Adversarial Networks without Pre-training
Improving Neural Machine Translation with Conditional Sequence Generative Adversarial Nets
Tuning Recurrent Neural Networks with Reinforcement Learning
Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation
Generative Concatenative Nets Jointly Learn to Write and Classify Reviews
Scheduled Sampling for Sequence Prediction with Recurrent Neural Networks
Feature: Beam Search for Improving Global Quality of New Text Samples
Exclude Top Choices (XTC): A Sampler That Boosts Creativity, Breaks Writing Clichés, and Inhibits Non-Verbatim Repetition
Pixels Still Beat Text: Attacking the OpenAI CLIP Model With Text Patches and Adversarial Pixel Perturbations
Apple or IPod? Easy Fix for Adversarial Textual Attacks on OpenAI's CLIP Model!
https://chat.openai.com/share/04add58f-2052-4b60-ae2a-ab708c29088f
https://datajenius.com/2022/02/12/the-effect-of-various-text-generation-methods-on-the-outputs-of-gpt-2/
https://homepages.inf.ed.ac.uk/abmayne/publications/sennrich2016NAACL.pdf
https://mi.eng.cam.ac.uk/projects/cued-rnnlm/papers/Interspeech15.pdf
https://openai.com/index/introducing-structured-outputs-in-the-api/#_5PYjnV1iAHOPKPupDztdZk
https://workshop2015.iwslt.org/downloads/IWSLT_2015_RP_13.pdf
https://www.lesswrong.com/posts/4Hnso8NMAeeYs8Cta/revealing-intentionality-in-language-models-through-adavae#BigVAE_and_Its_Samplers