GIGO is an acronym, meaning "garbage in, garbage out". It was the first thing taught to me when I started using computers back in the middle of the last century.
It is the first law of computing. A computer is, after all, only an adding machine. If you input incorrect data, you get a wrong answer. As the meerkat says: simples.
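That first law can be sketched in a few lines. The numbers here are invented for illustration: imagine a mis-keyed reading slipping into a set of measurements. The arithmetic is flawless; the answer is still garbage.

```python
# GIGO in miniature: the computer averages perfectly,
# but one mis-typed value (121.8 entered instead of 12.18,
# a hypothetical slip) poisons the result.
readings = [12.1, 11.8, 12.3, 121.8, 12.0]

average = sum(readings) / len(readings)
print(average)  # a "correct" calculation on incorrect data
```

The machine did exactly what it was told; it simply had no way of knowing the input was wrong.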
The recent bad weather and the global warming fiasco have hammered the point home.
This is the BBC's description of the Met Office's new supercomputer:
"The computer is about the size of two football pitches and can make about 750 trillion calculations a second - equivalent to 100,000 PCs."
Yes, but if you feed in duff data, you'll always get duff answers. Remember the first rule?
All these computer models are doubly or even triply flawed.
1. If you devise a computer model according to your specifications, the results will be in line with your model. So, if you believe that global warming is happening, you will devise a model to reflect your views/beliefs, and no matter what data you feed in, the answer will come out the way you programmed it.
2. If you filter the data to take out any anomalies before inputting it, the result will be flawed.
3. If you then publish your results (in the form of a weather forecast) and it's wrong, people will get upset.
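The filtering flaw in point 2 can be sketched with made-up numbers. Suppose a series of temperature readings is "cleaned" by throwing away anything far from the median before it goes into the model. The threshold and the data below are invented for illustration; the point is only that the cleaning step itself changes the answer.

```python
import statistics

# Hypothetical temperature readings; values chosen for illustration.
raw = [14.2, 13.9, 14.5, 18.7, 14.1, 19.2, 13.8]

# "Filter the data to take out any anomalies before inputting it":
# drop anything more than 2 degrees from the median. This discards
# the genuinely warm readings along with any true errors.
median = statistics.median(raw)
filtered = [x for x in raw if abs(x - median) <= 2.0]

print(statistics.mean(raw))       # average of everything
print(statistics.mean(filtered))  # cooler average after "cleaning"
```

The filtered average is lower than the raw one, so whatever the model computes downstream is already skewed before a single forecast is made.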
And it doesn't matter how big your computer is. Garbage in, garbage out, remember?