Many people think fiction books are just made-up stories, but that's not entirely true. Plenty of these tales are grounded in real science and facts! Authors often spend tons of time researching the world around us, exploring science and human life, before putting pen to paper. That's why I really believe we should pay more attention to fiction books. Maybe we could even start including them in school lessons – how cool would that be?