Remote Sensing and Satellite-Based Earth Observation

Satellites don't just take pretty pictures of Earth — they measure it, continuously and systematically, in ways that ground-based instruments simply cannot replicate at scale. This page covers how remote sensing works, what drives its design tradeoffs, how different sensor and orbit types are classified, and where the common misconceptions tend to cluster. The applications span environmental science and earth systems, disaster response, agricultural monitoring, and climate research — making remote sensing one of the most consequential toolsets in modern earth observation.


Definition and scope

Remote sensing is the acquisition of information about an object or surface without physical contact — typically by detecting electromagnetic radiation that the surface emits, reflects, or scatters. When the platform carrying the sensor is a satellite, the result is a global, repeatable record of Earth's surface and atmosphere that no field crew could replicate.

The US Geological Survey (USGS) defines remote sensing as "the science and art of obtaining information about an object, area, or phenomenon through the analysis of data acquired by a device that is not in contact with the object." That definition, dry as it reads, covers an enormous range: a thermal infrared sensor mapping sea surface temperature is remote sensing; so is a radar instrument measuring ice sheet elevation in Greenland.

The scope of satellite-based earth observation extends across the entire electromagnetic spectrum, from ultraviolet through visible light, near-infrared, shortwave infrared, thermal infrared, and microwave bands. Each band reveals something different about the surface below — and not everything is visible to the naked eye, which is rather the point. The Landsat program, operated jointly by NASA and the USGS and the longest-running satellite earth observation program, has collected imagery continuously since 1972, producing an archive spanning more than five decades that is now freely available through the USGS EarthExplorer portal.


Core mechanics or structure

Every remote sensing system has three functional components: a source of electromagnetic energy, the surface or atmosphere that interacts with that energy, and a sensor that records what returns or what is emitted.

Passive sensors detect naturally occurring energy — typically sunlight reflected from Earth's surface, or thermal radiation emitted by the surface itself. These sensors depend on available illumination, which means they cannot collect optical data through clouds or at night (with the exception of thermal bands, which work day and night).

Active sensors generate their own energy pulse, transmit it toward the target, and record the return signal. Synthetic Aperture Radar (SAR) and LiDAR are the primary examples. Because SAR transmits its own microwave pulses, it penetrates cloud cover and operates regardless of solar illumination — a critical advantage for monitoring flood events, tropical deforestation, or Arctic ice, where clouds are persistent and solar angles are extreme.

The raw data a satellite sensor captures is typically expressed as Digital Number (DN) values, which must be converted to physically meaningful units — reflectance, radiance, or backscatter — through a calibration process. This conversion is non-trivial: atmospheric effects scatter and absorb radiation between the surface and the sensor, introducing errors that atmospheric correction algorithms must remove before analysis.
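As a sketch of that conversion: the rescaling from DN to top-of-atmosphere (TOA) reflectance is linear per band. The gain and offset below are representative of Landsat Collection 2 Level-1 metadata; real values must always be read from each scene's own metadata (MTL) file, and atmospheric correction would still follow.

```python
import numpy as np

# Representative Landsat Collection 2 Level-1 rescaling coefficients
# (REFLECTANCE_MULT_BAND_x / REFLECTANCE_ADD_BAND_x in the MTL file).
GAIN = 2.0e-5      # multiplicative rescaling factor
OFFSET = -0.1      # additive rescaling factor

def dn_to_toa_reflectance(dn, sun_elevation_deg):
    """Rescale raw DN values to TOA reflectance, corrected for solar elevation."""
    rho = GAIN * dn.astype(np.float64) + OFFSET
    return rho / np.sin(np.radians(sun_elevation_deg))

dn = np.array([[7500, 9200], [11000, 30000]], dtype=np.uint16)
rho = dn_to_toa_reflectance(dn, sun_elevation_deg=45.0)
```

The solar-elevation division normalizes scenes acquired under different illumination, which is what makes reflectance (unlike raw DN) comparable across dates.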

Spatial resolution, spectral resolution, temporal resolution, and radiometric resolution are the four fundamental image quality parameters. A sensor with 30-meter spatial resolution records one reflectance value per 30×30-meter ground cell. Spectral resolution refers to how many and how narrow the wavelength bands are — hyperspectral sensors can capture more than 200 contiguous bands. These four parameters are in constant tension with one another, as discussed in the tradeoffs section below.


Causal relationships or drivers

The capabilities of satellite remote sensing are driven by physics, orbital mechanics, and sensor engineering — not arbitrary design decisions.

Orbital altitude determines both coverage and resolution. Low Earth Orbit (LEO) satellites, operating between roughly 160 and 2,000 kilometers altitude, achieve finer spatial resolution because they are closer to the surface. The tradeoff is revisit time: a LEO satellite in a polar orbit typically passes over a given location once every 16 days unless a constellation of satellites shortens that interval. The Sentinel-2 constellation operated by the European Space Agency (ESA), for example, achieves a 5-day revisit at 10-meter resolution by using two satellites in the same orbital plane.
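The altitude–revisit relationship follows directly from Kepler's third law. A minimal sketch, using standard constants and ignoring perturbations:

```python
import math

MU = 398600.4418        # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6378.137      # Earth's equatorial radius, km

def orbital_period_minutes(altitude_km):
    """Circular-orbit period from Kepler's third law: T = 2*pi*sqrt(a^3/mu)."""
    a = R_EARTH + altitude_km
    return 2 * math.pi * math.sqrt(a**3 / MU) / 60.0

# A Landsat-like orbit at ~705 km completes about 14.5 orbits per day;
# combined with a 185 km swath, that produces the 16-day repeat cycle.
period = orbital_period_minutes(705)   # ~98.9 minutes
```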

Geostationary orbit (approximately 35,786 kilometers altitude) allows a satellite to remain fixed over one point on Earth — which is why weather satellites like NOAA's GOES-18 can stream images of the Western Hemisphere every 10 minutes. The physics of that altitude produces coarse spatial resolution: GOES-18 achieves 500 meters at best, and 2 kilometers for most spectral bands, per NOAA's GOES-R Series documentation.
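The geostationary altitude is not arbitrary: it is the unique circular orbit whose period matches Earth's sidereal rotation, as a quick calculation shows.

```python
import math

MU = 398600.4418          # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6378.137        # Earth's equatorial radius, km
SIDEREAL_DAY = 86164.1    # seconds: one rotation relative to the stars

# Kepler's third law solved for semi-major axis: a = (mu * T^2 / (4*pi^2))^(1/3)
a = (MU * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
geo_altitude_km = a - R_EARTH   # ~35,786 km
```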

The wavelength a sensor uses determines what physical properties it is sensitive to. Chlorophyll absorbs red light and reflects near-infrared strongly — which is why the Normalized Difference Vegetation Index (NDVI), calculated from just two spectral bands, has become a global standard for monitoring vegetation health (NASA Applied Sciences). Microwave wavelengths between roughly 1 and 30 centimeters interact with water content and surface roughness, making them essential for soil science and pedology applications and for tracking soil moisture globally.
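NDVI itself is simple enough to sketch in a few lines. The reflectance values below are illustrative, not taken from any real scene:

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel.

    Inputs are surface-reflectance arrays; output ranges from -1 to 1,
    with dense green vegetation typically above 0.6 and bare soil
    around 0.1-0.2.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Guard against division by zero over dark or nodata pixels.
    return np.where(denom == 0, 0.0, (nir - red) / np.maximum(denom, 1e-12))

red = np.array([0.05, 0.10, 0.30])   # healthy veg, sparse veg, bare soil
nir = np.array([0.50, 0.30, 0.35])
print(ndvi(nir, red))                # ~[0.818, 0.5, 0.077]
```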


Classification boundaries

Remote sensing systems are classified along four primary axes:

  1. Platform type: Satellite, airborne (aircraft or UAV), ground-based, or instruments hosted on crewed spacecraft such as the International Space Station (in orbit, but not a dedicated remote sensing satellite).
  2. Sensor type: Passive optical, passive thermal, active radar (SAR), active LiDAR, or active sonar (acoustic rather than electromagnetic, and not usable from orbit).
  3. Spectral domain: Multispectral (typically 3–15 bands), hyperspectral (greater than 100 bands), panchromatic (single broadband), or SAR (microwave).
  4. Orbit type: Low Earth Orbit polar/sun-synchronous, geostationary, medium Earth orbit, or highly elliptical orbit (used for high-latitude coverage).

Sun-synchronous orbit deserves specific mention. Most land observation satellites operate in this configuration, where the orbital plane maintains a constant angle relative to the sun. This means the satellite crosses any given latitude at approximately the same local solar time on every pass — ensuring consistent solar illumination across a multi-year archive. Without that consistency, comparing imagery from different dates would require correcting for wildly different sun angles.
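The required inclination falls out of Earth's oblateness (the J2 perturbation), which precesses the orbital plane; sun-synchronous design tunes that precession to track the Sun's apparent motion. A sketch under circular-orbit assumptions:

```python
import math

MU = 398600.4418       # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6378.137     # Earth's equatorial radius, km
J2 = 1.08263e-3        # Earth's oblateness coefficient

def sun_synchronous_inclination_deg(altitude_km):
    """Inclination whose J2 nodal precession matches the Sun's apparent
    motion (~0.9856 deg/day), keeping local solar crossing time fixed."""
    a = R_EARTH + altitude_km
    target = 2 * math.pi / (365.2422 * 86400)   # required precession, rad/s
    coeff = -1.5 * math.sqrt(MU) * J2 * R_EARTH**2 * a**-3.5
    return math.degrees(math.acos(target / coeff))

inc = sun_synchronous_inclination_deg(705)   # ~98.2 deg for a Landsat-like orbit
```

The result is slightly retrograde (just past 90 degrees), which is why sun-synchronous satellites cross the equator heading a little west of due south on descending passes.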

The distinction between multispectral and hyperspectral is not just technical — it changes what questions can be answered. Multispectral imagery from Landsat 9's 11 bands (nine from the OLI-2 sensor plus two thermal bands from TIRS-2) is sufficient to map land cover classes, detect burn scars, or support glaciology and ice science applications. Hyperspectral imagery from sensors like NASA's AVIRIS can identify specific mineral compositions or vegetation species because the fine spectral detail captures absorption features invisible to broader band sensors.


Tradeoffs and tensions

This is where satellite remote sensing becomes genuinely contested — not politically, but physically. The four resolution types (spatial, spectral, temporal, radiometric) are governed by engineering constraints that mean improvement in one dimension typically comes at a cost in another.

A sensor with very high spatial resolution (sub-meter commercial satellites like Maxar's WorldView series, which achieve 30-centimeter panchromatic resolution) collects fewer photons per sample, reducing radiometric sensitivity and typically limiting spectral range. The sensor also covers a smaller swath width, which means it takes longer to build up a global mosaic.

Hyperspectral sensors require more data storage, downlink bandwidth, and processing capacity per scene than multispectral sensors — which has historically constrained their deployment to airborne instruments or short science missions rather than operational satellite constellations.

Cloud cover is the persistent nemesis of passive optical systems. Globally, roughly 66 percent of Earth's surface is covered by cloud at any given moment (NASA/CNES CALIPSO mission data), which means optical satellites in tropical or maritime climates may have very limited cloud-free observations in a given month. SAR resolves this, but SAR data is more complex to interpret and requires specialized processing expertise.

There is also a growing tension between commercial satellite data and the open-access scientific record. Commercial providers like Planet Labs fly constellations that deliver daily 3–5 meter imagery globally — but that data is not freely available. The USGS Landsat and ESA Sentinel archives, freely distributed, have enabled an entire generation of global-scale research; whether commercial constellations can or will replicate that openness remains an open question.


Common misconceptions

Misconception: Satellites see through clouds with any sensor.
Only radar (SAR) and some microwave instruments penetrate cloud cover. Optical and thermal sensors, including the highly publicized high-resolution commercial imagery, are blocked by cloud as effectively as a camera in fog.

Misconception: Higher spatial resolution is always better.
Resolution is only useful relative to the feature of interest. Mapping global deforestation at continental scale with 30-meter Landsat data is scientifically rigorous and computationally tractable. Using sub-meter imagery for the same task would generate petabytes of data with no analytical advantage for coarse feature classes.

Misconception: Remote sensing data is directly interpretable.
Raw satellite data requires radiometric calibration, atmospheric correction, geometric rectification, and often additional processing before it represents a physically meaningful measurement. The "image" as seen in outreach materials has typically passed through a dozen processing steps.

Misconception: GPS is remote sensing.
The Global Positioning System determines the receiver's location — it doesn't observe the Earth's surface. GPS is navigation, not observation, though GPS signals are sometimes used in remote sensing applications (GPS radio occultation for atmospheric profiling is a legitimate remote sensing technique).

Misconception: Satellite images are taken instantaneously, like a photograph.
Many sensors build an image by scanning a swath as the satellite moves along its orbit, assembling a scene from many individual line acquisitions over seconds to minutes. Geometric distortions from this process are one of the corrections that preprocessing pipelines must address.


Checklist or steps

The standard workflow for extracting analysis-ready data from a satellite scene follows a defined sequence. The steps apply regardless of sensor type, though specific tools and parameters vary.

Standard satellite data processing sequence:

  1. Data acquisition and ingestion — Obtain raw Level-0 or Level-1 data from the archive (USGS EarthExplorer, ESA Copernicus Open Access Hub, or equivalent).
  2. Radiometric calibration — Convert DN values to radiance using sensor-specific calibration coefficients provided in the metadata.
  3. Atmospheric correction — Remove the effects of atmospheric scattering and absorption to retrieve surface reflectance. Tools include the USGS's LaSRC algorithm for Landsat and the Sen2Cor processor for Sentinel-2.
  4. Geometric correction and orthorectification — Remove positional errors caused by terrain relief and satellite viewing geometry, referencing a digital elevation model.
  5. Cloud and cloud-shadow masking — Flag and exclude pixels affected by cloud cover or shadow using automated mask layers (e.g., Landsat's CFMask or Sentinel-2's Scene Classification Layer).
  6. Mosaicking or temporal compositing — Combine multiple scenes to build cloud-free coverage over a target area or time period.
  7. Index calculation or classification — Compute spectral indices (NDVI, NDWI, NBR, etc.) or apply land cover classification algorithms.
  8. Accuracy assessment — Validate classification outputs against independent reference data to quantify producer's and user's accuracy.
  9. Data dissemination — Export, publish, or archive analysis-ready outputs in standardized formats (GeoTIFF, NetCDF, COG).

This sequence is codified in USGS's Landsat Collection 2 processing documentation and ESA's Sentinel-2 processing baseline definitions.
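Step 8 in the sequence above can be illustrated with a toy confusion matrix; the class labels and counts below are hypothetical.

```python
import numpy as np

# Accuracy assessment sketch. Rows = reference (ground truth) classes,
# columns = mapped classes. Hypothetical classes: 0=water, 1=forest, 2=urban.
cm = np.array([
    [50,  2,  0],   # reference water
    [ 3, 80,  7],   # reference forest
    [ 0,  5, 53],   # reference urban
])

# Producer's accuracy: of all reference pixels in a class, how many were
# mapped correctly (1 - omission error). User's accuracy: of all pixels
# mapped to a class, how many are truly that class (1 - commission error).
producers = np.diag(cm) / cm.sum(axis=1)
users     = np.diag(cm) / cm.sum(axis=0)
overall   = np.trace(cm) / cm.sum()

print(producers)  # forest: 80/90 ~ 0.889
print(users)      # forest: 80/87 ~ 0.920
print(overall)    # 183/200 = 0.915
```

Real assessments draw the reference samples with a probability design (for example, stratified random) so these proportions are statistically defensible, not just descriptive.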


Reference table or matrix

| Parameter | Landsat 9 (OLI-2/TIRS-2) | Sentinel-2A/2B | GOES-18 (ABI) | Sentinel-1 (SAR) |
|---|---|---|---|---|
| Operator | USGS / NASA | ESA | NOAA | ESA |
| Orbit type | Sun-synchronous LEO | Sun-synchronous LEO | Geostationary | Sun-synchronous LEO |
| Altitude | ~705 km | ~786 km | ~35,786 km | ~693 km |
| Spatial resolution | 30 m (multispectral), 15 m (pan), 100 m (thermal) | 10 m / 20 m / 60 m (band-dependent) | 500 m – 2 km | 5 × 20 m (IW mode) |
| Spectral bands | 11 | 13 | 16 | C-band SAR (5.405 GHz) |
| Revisit time | 16 days | 5 days (2-satellite) | Continuous (~10 min) | 6–12 days |
| Cloud penetration | No | No | No | Yes |
| Data access | Free (USGS EarthExplorer) | Free (Copernicus Hub) | Free (NOAA CLASS) | Free (Copernicus Hub) |
| Primary use | Land cover, change detection | Agriculture, land, vegetation | Weather, severe storms | Flood mapping, ice, deforestation |

The broader landscape of earth science tools and technologies — from ground-penetrating radar to seismograph networks — shares the same underlying logic as remote sensing: translate a physical phenomenon into a measurable signal, then work backwards from the signal to the phenomenon. Remote sensing simply does this from 700 kilometers up, which creates both its extraordinary reach and its characteristic limitations.

For context on how satellite observation integrates with field-based methods, the fieldwork and data collection section provides complementary detail. The full scope of disciplines that rely on remote sensing data — from meteorology and atmospheric science to climate science and climatology — is mapped across the earthscienceauthority.com topic framework.


References