toot5330 · 21-03-2024 · answered

What could be a reason behind hallucinations in LLMs, as discussed in the information provided?

a) Limited data availability
b) Narrow training on specific domains
c) Overemphasis on coherence over creativity
d) Noisy and inconsistent training data