Information, according to the mathematical theory that bears its name, reduces uncertainty. If, for example, I tell you I tossed a coin twice, you'll know there were four equally probable outcomes. But if I then tell you the first toss came up tails, the number of possible outcomes is cut in half: tails/heads or tails/tails.
In this way, the information I have given you has cut your uncertainty in half. Everything we do in IT starts here, with the definition of a “bit.”
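The arithmetic behind the coin-toss example can be sketched in a few lines. Uncertainty over N equally probable outcomes is log2(N) bits (Shannon entropy of a uniform distribution), so learning the first toss takes you from log2(4) = 2 bits down to log2(2) = 1 bit, i.e. exactly one bit of information. The function name below is just illustrative:

```python
import math

def bits_of_uncertainty(num_outcomes: int) -> float:
    # Shannon entropy of a uniform distribution over
    # num_outcomes equally probable outcomes, in bits.
    return math.log2(num_outcomes)

before = bits_of_uncertainty(4)  # two tosses: HH, HT, TH, TT
after = bits_of_uncertainty(2)   # first toss is tails: TH, TT

print(before)          # 2.0 bits of uncertainty
print(after)           # 1.0 bit remaining
print(before - after)  # 1.0 bit of information conveyed
```

The difference, before minus after, is the information content of the message "the first toss was tails": one bit, the unit everything else in IT is built on.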