The biggest buzz in ground-based astronomy these days is the soon-to-be-completed Rubin Observatory and its forthcoming wide-field Legacy Survey of Space and Time (LSST). From a lonely mountaintop in northern Chile’s Atacama Desert, the observatory’s 8.4-meter optical telescope will scan the southern sky roughly every three to four nights.

In the process, its observations over a decade will generate an unprecedented amount of raw data, much of it related to so-called transient astronomical events. Such events are usually active over brief periods of days or weeks and can involve highly energetic and destructive astrophysical phenomena such as supernovae or gamma-ray bursts. In fact, the LSST is expected to generate so much data that managing it will require software and technology bordering on artificial intelligence.

The telescope’s repeated scans of its 9.6-square-degree field of view (about the size of 40 full moons) will use a 3.2-gigapixel camera to generate some 10 million astronomical alerts per night. In astronomical parlance, an alert is triggered when a celestial object changes its brightness and/or position in the sky over short time scales.
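To make that concrete, here is a minimal sketch of the trigger idea, assuming a simple per-object brightness check (real pipelines detect changes by subtracting a reference image from each new exposure; every name and number below is illustrative):

```python
def should_alert(new_flux, ref_flux, flux_err, threshold_sigma=5.0):
    """Flag a detection when the flux change is statistically significant.

    A deliberate simplification: real pipelines find changes by
    subtracting a reference image from each new image, not by
    comparing per-object flux values like this.
    """
    significance = abs(new_flux - ref_flux) / flux_err
    return significance >= threshold_sigma

# A source brightens from 120 to 200 counts with 10-count uncertainty:
print(should_alert(200.0, 120.0, 10.0))  # True: an 8-sigma change
```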

But within 60 seconds of hitting the telescope’s primary mirror, the light from these events will be converted into data and transferred via high-speed optical links into massive amounts of cloud storage. From there, the raw data will be processed and sent out to astronomers worldwide by so-called alert brokers.

“An alert broker is an intermediary between the survey telescope, your observational science data, and follow-up telescopes,” Francisco Forster, an astrophysicist at the University of Chile, told me in his office in Santiago. “Because of the number of alerts expected with the LSST, you need to have special groups that have the capacity to ingest the alert stream and then do something with it,” he says.
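In software terms, a broker’s front end is essentially a loop that ingests alert packets from a stream and hands them to downstream filters. Here is a minimal Python sketch using a Kafka consumer (Rubin’s alerts are served over Apache Kafka, but the server address, topic name, and group ID below are placeholders, not real endpoints):

```python
from confluent_kafka import Consumer

# Placeholder connection details -- not Rubin's real endpoints.
consumer = Consumer({
    "bootstrap.servers": "alerts.example.org:9092",
    "group.id": "my-toy-broker",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["lsst-alerts"])  # hypothetical topic name

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1 s for the next alert
        if msg is None or msg.error():
            continue
        packet = msg.value()  # serialized alert payload (Rubin uses Avro)
        # ... deserialize, cross-match, classify, and republish here ...
finally:
    consumer.close()
```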

At the ‘Cosmic Streams in the Era of Rubin’ conference held last month in Puerto Varas, Chile, an international group of astronomers gathered to discuss exactly how the data that Rubin generates can best be processed. Once the telescope begins routine science operations in 2025, its alerts will be followed up by other observatories in near-real time.

Most follow-up observations of these alerts will use spectroscopy, the study of an object’s electromagnetic spectrum, to further measure and characterize the celestial targets that produced them. But it’s also possible to observe the events that precipitated the alerts across multiple electromagnetic wavelengths. In some cases, this could even include the new field of gravitational wave astronomy.

The LSST Needs Advanced Algorithms

“We need algorithms that can scale up to LSST data streams,” Patrick David Aleo, a doctoral candidate in astronomy at the University of Illinois Urbana-Champaign, told me via email. “We need algorithms that find celestial anomalies,” he says. “With the LSST, we expect to find objects which we didn’t even know existed,” says Aleo.
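One common way to search for objects “we didn’t even know existed” is unsupervised anomaly detection: score each object by how unusual its light-curve features look, then inspect the outliers. A minimal sketch, assuming a handful of invented summary features (real brokers use far richer feature sets and models):

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Invented features summarizing each object's light curve
# (think: amplitude, rise time, color). Purely illustrative.
ordinary = rng.normal(loc=0.0, scale=1.0, size=(10_000, 3))
oddballs = rng.normal(loc=6.0, scale=1.0, size=(5, 3))
features = np.vstack([ordinary, oddballs])

model = IsolationForest(contamination=0.001, random_state=0)
model.fit(features)

scores = model.decision_function(features)  # lower = more anomalous
print("Most anomalous objects:", np.argsort(scores)[:5])
```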

And even though the telescope will not use artificial intelligence in the classic sense of machine thinking, it’s clear that the future of astronomy lies in A.I. The amount of data that future telescopes will produce will demand A.I. capabilities that let astronomers analyze raw data with speeds and accuracies that would heretofore have seemed like science fiction.

“But if we are going to apply machine learning, it must be super-fast,” says Forster, the conference’s primary organizer. “You cannot wait more than one second per object to classify the object,” he says.
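That one-second budget is usually met by scoring alerts in batches so the per-object cost stays tiny. A toy benchmark with entirely synthetic features and labels, just to show how per-alert latency can be measured:

```python
import time
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(5_000, 10))    # made-up alert features
y_train = rng.integers(0, 3, size=5_000)  # made-up classes (e.g., SN / variable / bogus)

clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
clf.fit(X_train, y_train)

batch = rng.normal(size=(10_000, 10))     # one incoming batch of alerts
start = time.perf_counter()
clf.predict(batch)
elapsed = time.perf_counter() - start
print(f"{elapsed / len(batch) * 1e6:.1f} microseconds per alert")
```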

But there is still some cultural resistance in the astronomical community to handing over complete control of the analysis to computer software.

“There can be an issue of trust,” Matthew Graham, a research professor in astronomy at Caltech, told me in Puerto Varas. “How much of our discovery process should we automate and give over to computers?” he wonders. “We know machines can make mistakes, particularly as a result of human error if they haven’t been programmed completely correctly,” he says.

The bottom line is that having humans in the loop as a safety check can sometimes be important.
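In code, that safety check often reduces to a confidence gate: act automatically only on high-confidence classifications and queue everything else for human review. A hypothetical sketch (the labels and threshold are illustrative, not any broker’s real policy):

```python
def route_alert(probabilities, auto_threshold=0.95):
    """Act automatically on confident classifications; queue the rest.

    'probabilities' maps class labels to classifier confidences.
    All names and the threshold are illustrative.
    """
    label, confidence = max(probabilities.items(), key=lambda kv: kv[1])
    if confidence >= auto_threshold:
        return ("auto", label)       # e.g., trigger follow-up automatically
    return ("human_review", label)   # a person double-checks the call

print(route_alert({"supernova": 0.98, "variable_star": 0.02}))
# ('auto', 'supernova')
print(route_alert({"supernova": 0.55, "bogus": 0.45}))
# ('human_review', 'supernova')
```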

As for all the missed follow-up observational opportunities?

Even though timely follow-up observations of certain alerts may be technically impossible, there is a silver lining.

“The LSST will produce a dataset that we will continue to sift through for many years after the shutter is closed, and A.I. advancements will help us continue to design new ways of sifting,” Alexander Gagliano, a postdoctoral research fellow with the NSF-funded Institute for A.I. and Fundamental Interactions at MIT, told me by email.

As For The Science?

“One of the things that results from scanning the entire southern sky every three to four nights is the ability to find truly exceptional phenomena,” says Gagliano. “If one alert out of a million comes from something we’ve never seen before, then you can only make a groundbreaking discovery after collecting one million alerts,” he says. “The game is in quickly finding ways to pluck these rare events from the more common ones,” says Gagliano.
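The arithmetic behind that statement is easy to check against the alert rate quoted earlier. A back-of-the-envelope estimate, generously assuming alerts flow every night of the ten-year survey:

```python
alerts_per_night = 10_000_000  # the figure cited earlier in this article
rarity = 1 / 1_000_000         # "one alert out of a million"

per_night = alerts_per_night * rarity
per_survey = per_night * 365 * 10  # assumes alerts every night for 10 years

print(f"~{per_night:.0f} rare events per night")
print(f"~{per_survey:,.0f} over the decade-long survey")
```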

As For The Future Of Astronomy?

“Ten years ago, I predicted that by the 2020s, you would wake up and ask your smart assistant what had been detected the night before,” says Graham. “Then you’ll go, ‘Oh, great, let’s figure out what we can do with it,’” he says.

But even then, will human eyes still need to interpret astronomical data?

“I can’t look at a supernova light curve and estimate the radius of the star from which the explosion came,” says Gagliano. A well-trained algorithm can. “Yet the predictions of algorithms remain strongly dependent on the data that they’ve been shown,” he says. “By contrast, humans have an innate talent for generalizing to entirely new situations; you can enter a room with a lamp you’ve never seen before and still figure out how to turn it on,” says Gagliano. “Most algorithms can’t do this kind of fuzzy reasoning,” he says.
