The L in "LLM" Stands for Lying
Summary
The article argues that LLMs enable forgery at scale, and that robust source attribution is needed to distinguish authentic output from imitation. It critiques the surrounding hype, examines the implications for software development, art, and open source, and advocates transparent provenance and citation as a check on AI-generated misinformation.