Greek philosophers two thousand years ago already suspected that the atmosphere did not extend all the way into space. "Shooting stars" -- they called them "meteors," from a word meaning "high up" -- were thought to occur within the atmosphere (which happens to be true), while the stars and planets were thought to lie beyond it.
They had no proof (not what we would call "proof" in modern science), but they were quite confident.
When the barometer was invented (1640s), it was quickly shown that atmospheric pressure decreased as the observer went up in altitude. It was then fairly easy to estimate the altitude at which the pressure would drop to essentially zero (the start of space). That altitude was insignificant compared to the known distance to the Moon (it was already clear that Earth's atmosphere did not extend all the way to the Moon).
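In the modern picture, that falloff is roughly exponential, so the pressure never literally reaches zero but becomes negligible within a few tens of kilometers. A minimal sketch (Python, assuming an isothermal atmosphere with a scale height of about 8 km; the real atmosphere's temperature varies with altitude, so this is only illustrative, not the 17th-century calculation):

```python
import math

# Assumed scale height for Earth's lower atmosphere (an approximation;
# the true value varies with temperature and altitude).
SCALE_HEIGHT_KM = 8.0

def pressure_fraction(altitude_km: float) -> float:
    """Fraction of sea-level pressure remaining at a given altitude,
    under the isothermal barometric approximation."""
    return math.exp(-altitude_km / SCALE_HEIGHT_KM)

# Pressure falls off fast: one scale height cuts it by a factor of e.
for h in (0, 8, 16, 50, 100):
    print(f"{h:>3} km: {pressure_fraction(h):.2e} of sea-level pressure")
```

At 100 km (roughly where "space" is conventionally said to begin), this toy model already gives only a few millionths of sea-level pressure, which is why even crude extrapolations put the top of the atmosphere far below the Moon.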
More accurate limits were calculated in the 18th century (the 1700s), using Newton's calculus and more precise measurements of how quickly atmospheric pressure decreased with altitude.
When it became possible to measure radar ranges to meteors, the extent of Earth's atmosphere could be established quite accurately -- before the space program.
When Sergei Korolev launched the first Sputnik (1957), he already knew how far the atmosphere extended, and how much air was still up there at the altitude of Sputnik 1 (he knew it would eventually fall back due to air resistance). The air density there was a minuscule fraction of its value at Earth's surface.
At the altitude of the International Space Station (400 km, about 250 miles), there is still just enough air left to slow the station down, so it has to be boosted back up every once in a while.