Hacker News
'You are a helpful mail assistant,' and other Apple Intelligence instructions (theverge.com)
9 points by minimaxir on Aug 5, 2024 | 3 comments
minimaxir on Aug 5, 2024
A "do not hallucinate" prompt-engineering instruction is certainly a first, since it implies that LLMs hallucinate just for fun.
teamonkey on Aug 5, 2024
I don't understand how that would even work. Can an LLM even know the difference between a hallucination and (for want of a better word) an inference?
minimaxir on Aug 5, 2024
No, it's pointless and likely doesn't affect the output.