About
Helping the world be heard and get discovered at Protégé.
Previously - digitized…
Activity
5K followers
Michael Cruz replied to a comment
2y
Not a troll. We’ve been exploring this stuff too - better to learn together. I joined!
Michael Cruz replied to a comment
2y
Ian Sherwood love what you’re doing! Can I get access to this Telegram group?!
Michael Cruz replied to a comment
2y
Ian Sherwood in this situation, the opposite of a fact isn’t necessarily a lie. This is a hallucination, as there’s no deliberate intent to deceive on the LLM’s part. This is probabilistic auto-completion based on a specific training set. To reduce hallucination you can try some prompt-engineering tactics - chain of thought or self-evaluation - or leverage a vector store of known truths to pull from.
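The last suggestion in the comment above - grounding answers in a vector store of known truths - can be sketched minimally. This is a hypothetical illustration, not any particular product’s API: it uses a toy bag-of-words embedding and cosine similarity in place of a real embedding model, and the store and prompt-builder names are made up for the example.

```python
# Minimal sketch of grounding an LLM prompt in retrieved "known truths".
# All names here (VectorStore, grounded_prompt) are hypothetical; a real
# system would use a learned embedding model and an actual vector database.
from collections import Counter
import math


def embed(text):
    """Toy bag-of-words embedding: token -> count."""
    return Counter(text.lower().split())


def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class VectorStore:
    """Tiny in-memory store of (embedding, text) pairs."""

    def __init__(self):
        self.docs = []

    def add(self, text):
        self.docs.append((embed(text), text))

    def retrieve(self, query, k=1):
        vq = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(vq, d[0]), reverse=True)
        return [text for _, text in ranked[:k]]


def grounded_prompt(store, question):
    """Build a prompt that asks the model to answer only from retrieved facts."""
    facts = store.retrieve(question)
    context = "\n".join(facts)
    return f"Answer using only these facts:\n{context}\n\nQuestion: {question}"


store = VectorStore()
store.add("An LLM hallucination is probabilistic completion with no intent to deceive.")
store.add("Chain-of-thought prompting asks the model to reason step by step.")
print(grounded_prompt(store, "What is an LLM hallucination?"))
```

The retrieved facts are prepended to the question so the model completes against supplied context rather than its training-set priors alone - the same idea the comment points at with "a vector store of known truths to pull from".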
Experience & Education
Recommendations received
2 people have recommended Michael