Edited by: Vahram Chavushyan, National Institute of Astrophysics, Optics and Electronics, Mexico
Reviewed by: Víctor Manuel Patiño Álvarez, Max-Planck-Institut für Radioastronomie, Germany; C. S. Unnikrishnan, Tata Institute of Fundamental Research, India; Anna Lia Longinotti, National Institute of Astrophysics, Optics and Electronics, Mexico
This article was submitted to Milky Way and Galaxies, a section of the journal Frontiers in Astronomy and Space Sciences
This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
In the next 30 years, a new generation of space- and ground-based telescopes will make it possible to obtain multi-frequency observations of faint sources and, for the first time in human history, to achieve a deep, nearly synoptic monitoring of the whole sky. Gravitational wave observatories will detect a Universe of unseen black holes in the process of merging, over a broad spectrum of mass. Computing facilities will permit new high-resolution simulations with a deeper physical analysis of the main phenomena occurring at different scales. Given these development lines, we first sketch a panorama of the main instrumental developments expected in the next thirty years, dealing not only with electromagnetic radiation but also with a multi-messenger perspective that includes gravitational waves, neutrinos, and cosmic rays. We then present how the new instrumentation will make it possible to foster advances in our present understanding of galaxies and quasars. We focus on selected scientific themes that are hotly debated today, in some cases advancing conjectures on major problems that may be solved in the next 30 years.
The development of astronomy in the second half of the twentieth century followed two major lines of improvement: the increase in light gathering power (i.e., the ability to detect fainter objects), and the extension of the frequency domain in the electromagnetic spectrum beyond the traditional optical domain. Around 40 optical, ground-based, reflecting telescopes with diameters larger than 3 m are operational at the time of writing: only eleven of them became operational before 1990, which means that 3/4 of the largest telescopes have been built in the last 25 years. The 14 telescopes of the 8–10 m class (counting 2 for the Large Binocular Telescope, LBT, and 4 for the Very Large Telescope, VLT) all became operational around the year 2000 or afterwards, with the exception of the first Keck telescope (where science observations began in 1993). Telescopes of the 6–10 m class in space and telescopes of the 30–40 m class on the ground belong to the near future: the James Webb Space Telescope (JWST) will be launched soon, and the ground-based telescopes under construction are the ESO Extremely Large Telescope (E-ELT) and the Giant Magellan Telescope (GMT). The light gathering power is steadily increasing and will benefit spectroscopic studies of distant quasars, galaxies, and supernovæ.
The second line of development involved the extension over previously unexplored frequency domains of electromagnetic radiation: from the narrow optical range of 3,700–8,000 Å, ground- and space-based instrumentation now provides coverage of the electromagnetic spectrum from meter wavelengths to the γ-ray domain. If there is one safe prediction, it is that progress will continue along the line of deeper, wider, and faster coverage of cosmic sources at all accessible wavelengths of the electromagnetic spectrum.
A third line—the increase in resolving power (i.e., the ability to resolve finer details of distant objects)—has been pursued through larger apertures, interferometry, and adaptive optics.
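As a rough guide to what resolving power means in practice, the theoretical diffraction limit of a telescope follows from the Rayleigh criterion, θ ≈ 1.22 λ/D. The sketch below (the apertures are illustrative choices, not figures from the text) compares a 2.4 m space telescope with a 39 m ground-based one at 1 μm:

```python
import math

def diffraction_limit_arcsec(wavelength_m, aperture_m):
    """Rayleigh criterion: theta ~ 1.22 * lambda / D (radians), in arcsec."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600.0

# Illustrative apertures: an HST-like 2.4 m mirror vs an ELT-like 39 m mirror
for name, d in [("2.4 m", 2.4), ("39 m", 39.0)]:
    print(f"{name}: {diffraction_limit_arcsec(1e-6, d) * 1000:.1f} mas at 1 micron")
```

The order-of-magnitude gain in angular resolution (from ~100 mas to a few mas) is what adaptive optics on 30–40 m class telescopes aims to deliver routinely.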
There is, however, a most exciting development that will break the almost complete monopoly of electromagnetic radiation as the carrier of information from extragalactic sources: the detection of gravitational waves.
At the same time, new types of telescopes and instruments will soon be in operation for the detection of neutrinos and cosmic rays. All these endeavors contribute to a multi-messenger view of the Universe.
An eminent physicist foresaw that we will be able to have a complete account of reality, from the Big Bang to humans, in physical and chemical terms, within this century. Although this idea is perhaps too optimistic, it is reasonable to presume that we will have a reasonably complete view of the constituents of the visible Universe and of its evolution from the dark ages onward.
Once an overall physical understanding is reached, we expect that science will progress toward the explanation and modeling of finer physical details. This has been the case for the study of stars in the twentieth century. Stellar evolution theory provides a detailed general physical framework with predictive power, although there are still many challenging aspects in the physics of stellar atmospheres and stellar structure (magnetic reconnection, flares, internal oscillation, internal turbulence) that are at the frontiers of present-day research. In this sense, we may hope to reach a global understanding of the nature of active galactic nuclei (AGN).
Several developments expected in the next 30–40 years are not so difficult to foresee, not least because new astronomical instruments require careful planning that may imply at least a 10-year lapse between early proposals and first light at the facility. We will first review the major observational projects that are ongoing or planned and are expected to have frontier capability.
Generally speaking, the most modern ground-based telescopes, and even more so the forthcoming ones, are best suited for spectroscopy to deep limits (currently 25.6 AB mag in the Z-band in a 3 h exposure at LBT) or at high spectral resolution.
Ground-based telescopes equipped with adaptive optics systems have also made great advances in high-resolution near-IR imaging. For example, GRAVITY, a VLT-based imaging IR interferometer, is expected to reach a resolving power of three milliarcsec (Eisenhauer et al.).
Space-based telescopes provide complementary capabilities: backgrounds are 10–100 times lower than on the ground, and space offers a stable operating environment. Space-based observations reach the same broadband imaging depths as ground-based observations up to 100 times faster, and will achieve fainter magnitude limits with the next generation of instruments.
Space telescopes also uniquely cover the information-rich UV, FIR, and sub-mm spectral ranges that are blocked by Earth's atmosphere. One such planned project is LUVOIR (France et al.).
Comparison of the expected performance of LUVOIR, an optical space-based telescope of 12 m aperture.
A most notable advancement should be the building of fully steerable 40 m class telescopes in the optical and near infrared: the GMT and the ESO ELT. The first will consist of seven 8.4 m diameter segments, with a collecting area equivalent to a 22.0 m single-mirror telescope (McCarthy).
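The quoted equivalent aperture can be checked with a back-of-the-envelope computation. Treating the seven 8.4 m segments as ideal filled circles slightly overestimates the figure (the real off-axis segment shapes yield the quoted 22.0 m):

```python
import math

n_segments, seg_diameter_m = 7, 8.4
# Idealized circular segments; real GMT segments are off-axis and lose a little area
area = n_segments * math.pi * (seg_diameter_m / 2) ** 2
# Diameter of a single circular mirror with the same total collecting area
equiv_diameter = seg_diameter_m * math.sqrt(n_segments)
print(f"collecting area ~ {area:.0f} m^2, equivalent aperture ~ {equiv_diameter:.1f} m")
```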
The James Webb Space Telescope (JWST) is expected to be another milestone, providing a tenfold improvement in sensitivity and in spatial resolution with respect to the Hubble Space Telescope (HST; Gardner et al.).
The Euclid space mission, designed to study the baryonic acoustic oscillations (BAOs) from the large scale distribution of galaxies and quasars (see section 3.2.8 for a brief discussion), is expected to yield millions of moderate-resolution quasar spectra over the visual wavelength range (ESA).
The SDSS-V will still leave the faint end of the quasar luminosity function to be sampled by LSST. It may, however, serve as a testing experiment for dedicated, larger-aperture telescopes, given the need for spectroscopic coverage of faint sources.
The obscured Universe is an important part of extragalactic studies. The word “obscured” here refers to the optical/UV domain, which is subject to extinction by interstellar dust.
A powerful (albeit costly) strategy is to detect the tracks of γ-rays (as well as cosmic rays) through dedicated arrays of telescopes sensitive to the Cerenkov radiation emitted in the Earth's atmosphere. The Cherenkov Telescope Array (CTA), based on this approach, is expected to detect Cerenkov radiation from very high-energy (VHE) cosmic rays and γ-rays. The CTA, an array of ~100 built-for-purpose optical telescopes, should be able to cover a huge range in photon energy, from 20 GeV to 300 TeV, and be a factor ~10³ more sensitive on hour timescales than the space-based Fermi Large Area Telescope (LAT) at 30 GeV. The angular resolution of CTA will approach 1 arc-minute, allowing detailed imaging of a large number of γ-ray sources. CTA will be the first VHE observatory to reach the angular resolution needed for easy cross-identification of optical counterparts (Cherenkov Telescope Array Consortium et al.).
The Atacama Large Millimeter/submillimeter Array (ALMA) is an aperture synthesis array of 66 radio telescopes for sub-/millimeter astronomy. ALMA bands cover from 30 to 1,000 GHz (300 μm), with typical resolutions of a few hundredths of an arcsec (Testi and Walsh).
Currently, the Very Large Array-based FIRST (Becker et al.,
RadioAstron is a VLBI array combining ground-based telescopes with a 10 m antenna in space (giving a baseline of 350,000 km) that achieved the unprecedented spatial resolution of ≈35 μarcsec at 6 cm. RadioAstron is a successful effort that demonstrated the feasibility of VLBI with space antennæ. It is reasonable to think that by 2050 more sensitive arrays with more antennæ in space will be operational. Space-based radio antennæ obviously yield an extension of the baseline, but can also take advantage of a much lower background noise that allows for a much wider dynamic range (especially helpful when trying to map a boosted jet together with much fainter unboosted, extended features, a typical condition in extragalactic radio sources).
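The quoted resolution follows directly from the interferometer relation θ ≈ λ/B; a quick check with the numbers above:

```python
wavelength_m = 0.06   # 6 cm observing wavelength
baseline_m = 3.5e8    # ~350,000 km ground-space baseline
theta_rad = wavelength_m / baseline_m      # theta ~ lambda / B
theta_uas = theta_rad * 206_264.8 * 1e6    # radians -> microarcseconds
print(f"angular resolution ~ {theta_uas:.0f} microarcsec")
```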
The meter-wavelength domain remains poorly sampled. There are several radio telescopes operating at low frequency, among them the Giant Metrewave Radio Telescope (Ananthakrishnan).
Since spatial changes induced by gravitational waves occur with opposite sign along orthogonal directions, a Michelson interferometer has a geometry well suited to maximize the tiny effect on the detector. Laser interferometry is employed for motion sensing (Hough et al.).
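To get a feeling for why such exquisite motion sensing is needed: a gravitational wave of strain h changes an arm of length L by ΔL ≈ h·L. The numbers below are illustrative LIGO-like values assumed for this sketch, not figures from the text:

```python
L_arm = 4_000.0      # interferometer arm length in meters (LIGO-like, illustrative)
h = 1e-21            # representative gravitational-wave strain amplitude
delta_L = h * L_arm  # strain h ~ delta_L / L  ->  induced arm-length change
print(f"arm-length change ~ {delta_L:.1e} m (a small fraction of a proton diameter)")
```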
S/N levels as a function of redshift (left scale), luminosity distance (right scale), and total source-frame mass, for the baseline configuration of LISA, at fixed mass ratio.
A complementary technique to detect low-frequency gravitational waves is to consider an array of millisecond pulsars and measure the pulse arrival times. Differences between expected and observed arrival times (the timing residuals) should be correlated between pulsars if produced by gravitational waves (Hobbs et al.).
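The expected correlation pattern as a function of the angular separation between pulsar pairs is the Hellings–Downs curve. A minimal sketch, using one common convention for the normalization (an assumption of this sketch, not stated in the text):

```python
import math

def hellings_downs(theta_rad):
    """Expected correlation of timing residuals for a pulsar pair separated by
    angle theta, for an isotropic stochastic gravitational-wave background
    (Hellings & Downs 1983)."""
    x = (1.0 - math.cos(theta_rad)) / 2.0
    if x == 0.0:
        return 0.5  # small-separation limit, excluding pulsar self-terms
    return 1.5 * x * math.log(x) - 0.25 * x + 0.5

print(f"correlation at 180 deg: {hellings_downs(math.pi):.3f}")
```

Detecting this characteristic angular pattern in the residuals, rather than any single-pulsar signal, is what establishes the gravitational-wave origin.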
The CTA will detect Cerenkov photons emitted by the particle cascades that cosmic rays initiate in the atmosphere. While the CTA will remain state-of-the-art for atmospheric Cerenkov CR detection in the coming decades, other types of CR detectors are expected to undergo significant developments. For instance, a large collecting area can be obtained by exploiting Cerenkov radiation emitted in dense media. In this case, the detection of Cerenkov flashes is achieved using photomultipliers submerged in water tanks over a large surface area, as in the case of HAWC (Carramiñana and HAWC Collaboration).
Cosmic-ray detection from extragalactic sources has been a serious problem in the past decades. Low-energy (≲ 1 GeV) cosmic rays are frequent (1 event s⁻¹ m⁻²) and their detection is straightforward with cloud chambers. The cosmic-ray energy spectrum, however, is a steeply falling power law (Beringer et al.).
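The practical difficulty is the steepness of the spectrum: event rates fall by many orders of magnitude with energy, which is why the highest energies demand very large collecting areas. A toy single-power-law estimate (the spectral index and the normalization at 1 GeV are illustrative assumptions; only the ~1 event/s/m² figure comes from the text, and the real spectrum steepens at the "knee"):

```python
def integral_rate_per_m2_s(E_GeV, gamma=1.7, rate_at_1GeV=1.0):
    """Rough integral cosmic-ray rate above energy E for a single power law
    N(>E) ~ E**-gamma, normalized to ~1 event m^-2 s^-1 at 1 GeV.
    Illustrative only: the real spectrum steepens above the PeV 'knee'."""
    return rate_at_1GeV * E_GeV ** (-gamma)

seconds_per_year = 3.15e7
rate_pev = integral_rate_per_m2_s(1e6) * seconds_per_year  # above ~1 PeV
print(f"above 1 PeV: ~{rate_pev:.1e} events per m^2 per year")
```

With these assumptions the rate above a PeV drops to of order 10⁻³ events per square meter per year, i.e., roughly one event per several hundred square meters per year, illustrating why km²-scale arrays are needed at the highest energies.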
Neutrinos offer a unique diagnostic of extremely high-energy processes, and, unlike cosmic rays, they can travel unimpeded across the magnetic field of the Galaxy. Given the weak interaction of neutrinos with matter, large detector masses are needed to reveal them. Mechanisms for the production of high-energy neutrinos will also produce γ-rays of similar energies. γ-ray telescopes such as CTA are expected to achieve the pointing precision and sensitivity needed to identify populations of accelerators.
A common design for a neutrino observatory involves an array of photomultiplier tubes housed in transparent containers, suspended within a large tank of pure water or ice (or other suitable materials) and aimed at detecting the Cerenkov radiation due to leptons (typically muons) or other decay products induced by neutrino interactions. Over the years, neutrino observatories have employed larger and larger volumes placed underground, to improve the detection rate and lower the background. The IceCube Neutrino Observatory is a cubic-kilometer detector that uses ice as the detection medium, registering Cerenkov radiation with an array of photomultipliers. At present, IceCube has detected ≲ 100 PeV neutrinos of astrophysical origin. The PeV detections of IceCube might be substantially increased by a second-generation observatory, IceCube-Gen2, which should be based on a 10 km³ volume of ice at the South Pole (IceCube-Gen2 Collaboration et al.).
The increase in computing power cannot follow Moore's law forever. Moore's law posits that the surface density of transistors on integrated circuits doubles approximately every 2 years. However, the limiting feature size of the technology is about 2–3 nm (down from about 14 nm today, which may still imply a more-than-tenfold increase in computing speed), and may be reached around 2025. At present it is not clear what may follow. The clock speed has saturated at about 3 GHz because faster speeds produce too much heating (Evans).
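The figures above can be combined into a quick estimate of the remaining headroom, taking 2.5 nm as a representative value of the quoted 2–3 nm limit:

```python
import math

feature_now_nm, feature_limit_nm = 14.0, 2.5   # figures quoted in the text
# Transistor density scales with the inverse square of the feature size
density_gain = (feature_now_nm / feature_limit_nm) ** 2
# Moore's law: density doubles every ~2 years
years_to_limit = 2.0 * math.log2(density_gain)
print(f"remaining density gain ~{density_gain:.0f}x, reached in ~{years_to_limit:.0f} years")
```

The ~30× density gain, exhausted in about a decade at the historical doubling rate, is consistent with the limit being reached around 2025.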
Large surveys and simulations will provide data on the order of petabytes; SKA is expected to store more than 1 petabyte of data per day. These “big data” will require new approaches to storage, distribution, and analysis.
Major surveys are expected to produce huge amounts of reduced data as well as public-domain Virtual Observatory (VO) compliant catalogs of measurements. The analysis of the large amounts of data provided by the instruments of the new generation of telescopes and by numerical experiments is expected to become increasingly cumbersome for human researchers. Ultimately, neural networks and other forms of artificial intelligence may become a necessity to manage the sheer amount of data. Deep machine learning has been considered for applications to astronomy since the late 1980s. A convolutional neural network is a class of deep, feed-forward artificial neural networks that has been successfully applied to analyzing visual imagery. Deep neural networks are being exploited for a host of problems associated with visual morphological classification, a frequent necessity in astronomy. For instance, they have proved very effective in evaluating galaxy morphology and extracting morphological parameters such as the Sérsic index and isophotal magnitudes (Tuccillo et al.).
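To make the idea concrete, the building block of such networks is the discrete 2D convolution of an image with a small filter, followed by a non-linearity. A minimal pure-Python sketch (the toy "galaxy image", the edge filter, and the sizes are invented for illustration; real morphology classifiers learn many such filters from training data):

```python
def conv2d(image, kernel):
    """Valid 2D cross-correlation: the core operation of a convolutional layer."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + a][j + b] * kernel[a][b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

def relu(fmap):
    """Rectified linear unit: keep positive responses, zero out the rest."""
    return [[max(0.0, v) for v in row] for row in fmap]

# A toy 5x5 "image" with a bright vertical bar, and a vertical-edge filter
img = [[1.0 if j == 2 else 0.0 for j in range(5)] for i in range(5)]
edge = [[-1.0, 0.0, 1.0]] * 3  # responds to left-to-right brightness gradients
fmap = relu(conv2d(img, edge))
print(fmap)  # strong response where the bar's left edge falls under the filter
```

Stacking many such filtered maps, with learned filter weights, is what lets deep networks recognize morphological features such as bars, bulges, and spiral arms.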
Summing up the previous discussion of instrumental capabilities, we can expect the ability to probe much deeper than today with the planned instrumentation. Observations with adaptive optics may become commonplace at the largest telescopes, yielding an overall improvement of a factor ~10 in resolution for “every night” observations in the optical and NIR domains. The largest telescopes are expected to reach magnitudes ~30, a more-than-tenfold improvement with respect to today. In ranges where spatial resolution has been poor (low-frequency radio, hard X-ray, γ-ray), improvements are expected to bring the resolution to ~1′. This means that, for example, bright optical extragalactic sources could be unambiguously identified with their γ-ray counterparts in the wide majority of cases. We can now focus in more detail on the possibilities that the new instrumental capabilities will offer in the study of galaxies and quasars, and of nuclear activity in general.
Imagining the future of our understanding of galaxies is not an easy task, in particular for researchers who formed their astrophysical background during the epoch of transition from photographic plates to CCDs, when the only big telescope was the 5 m Hale at Palomar, a few space missions had already been launched (e.g., IUE, Uhuru, Ariel 5), and ROSAT as well as HST were still to come. At that time radio observations had already revealed the spiral structure of our Galaxy and the first radio sources had been identified with optical counterparts. This is the epoch when computers began to be used for the reduction of astronomical images and spectra, and the most widely used compiler was Fortran 77. Galaxies were studied individually through deep CCD images or photographic plates (micro-densitometers were still largely in use) with optical filters (Johnson B and V) or long-slit spectra, and the first numerical models started to appear in the astronomical literature.
The progress in all fields of astronomy has been so far-reaching during these 30 years that a single researcher cannot keep up to date with the whole literature beyond his or her specific interests. When we consider that only 100 years ago humankind was not aware of the existence of galaxies, we may legitimately feel fortunate to be part of such a fantastic development of our field.
Coming back to the theme of this review, the first step in imagining the future of the study of galaxies is to keep in mind why we study galaxies. The second step is to make a list of the hottest research topics in this field. The first item is necessary because it acts as the helm of a ship: remembering why we study galaxies is important to stay on course, to follow the aims of our projects. The second item is fundamental because we plan our future research today, and this implies knowing many things, not least how much they cost in terms of economic resources.
Why do we study galaxies? Galaxies are the largest gravitationally bound systems in which stars are “organized” to trace the baryonic matter in the Universe. If our aim is to reconstruct the history of the Universe, we must understand how such organization of stars in galaxies changed across the cosmic epochs. This means examining how stars are distributed at all spatial scales, how and when galaxies and stars formed, and in which way the population of stars evolved. With such motivation, we can easily predict that the focus of our future research will be to understand the origin of galaxies. This implies that most of the efforts will be dedicated to the study of high-redshift objects.
When did galaxies emerge from the dark era? Which kind of stars formed first? How long was the epoch of reionization of the Universe, and what kind of sources contributed to it? Were the first galaxies similar in structure and shape to those we see at low redshift?
Up to now, coordinated efforts exploiting multi-wavelength observations have made it possible to trace a preliminary picture of the evolution of star formation in galaxies. In particular, we have measured the star formation rate density (SFRD), the rate at which stars formed within galaxies in comoving volumes of the Universe. Much of the credit for this achievement goes to space missions like HST, Spitzer, Herschel, and GALEX. The surveys carried out with these telescopes made it possible to acquire a large database of galaxies observed at different cosmic epochs. We should not forget, however, the important contribution of optical surveys at smaller redshifts, such as the SDSS. The SDSS sample contains ~300,000 galaxies brighter than the survey magnitude limit.
Madau and Dickinson (
The observed star formation rate density in galaxies at different redshifts (Credit to Madau and Dickinson).
The observed trend and the well-known
In this respect the recent work by Chiosi et al. (
The history of BH growth compared with that of stellar mass growth (Credit to Heckman and Best).
The current idea about the formation and evolution of galaxies is that stellar systems grow primarily through the accretion of gas from the cosmic web. Major mergers of gas-rich systems do occur and provide strong bursts of star formation, but do not seem to contribute to the bulk of star formation (they may account for ~10%). This seems consistent with numerical simulations and with observations of star-forming galaxies (see e.g., Dekel et al.).
Clearly, the possibility of making significant progress in this area is closely linked to our ability to plan new powerful telescopes on the ground and in space. Among the various projects that will have a big impact over the next 20 years we should mention the JWST and the ELT, whose construction has started in the Chilean desert. Ground-based telescopes will in fact largely benefit from the progress made in adaptive optics systems, such as MAORY.
In the next decade, high-redshift observations will likely make it possible to formulate a coherent picture of galaxy evolution linking the data available for the different cosmic epochs. What is important is to establish which physical processes have the largest effects and are responsible for the major transformations observed in size, morphology, and stellar population content.
By studying the earliest galaxies, JWST and ELT will contribute to understanding how galaxies grow and evolve. These telescopes will gather data on the types of stars that populated the very early objects. Spectroscopic follow-up observations will help clarify how elements heavier than hydrogen formed and accumulated in galaxies through the cosmic ages. These studies will also contribute to understanding the role of merging among galaxies and to a much better knowledge of the mechanisms of feedback from supernovae (SNe) and AGN.
AGN feedback has shaped the evolution of galaxies, in particular during the epochs of greatest interest for the evolution of stellar populations in galaxies.
A further step forward requires the analysis of data coming from large optical and near-infrared spectroscopic surveys. In particular, it will be extremely important to address the nature of the faintest galaxies.
The first galaxies with redshift larger than 1 were discovered with a color-selection technique (see Steidel et al.).
Morphology of galaxies at intermediate and high redshift. Redshift is given by the numbers in parentheses. (Credit to Buta.)
The morphological transformation of galaxies is generally accompanied by an evolution in the size of stellar systems. Galaxy size is typically measured through the effective radius.
At high redshift, the observed quiescent early-type galaxies have more compact sizes (by a factor of 3–5) than local objects (Daddi et al.).
The questions posed by these observations are therefore: how could massive and compact systems already be in place at early cosmic epochs, in a hierarchical Universe where the largest structures are the last to form, and what are their progenitors? What processes drive the evolution of these systems and the quenching of star formation? The planned sky surveys from the ground and from space, with billions of new measurements, will offer the possibility of answering all these questions.
Before the consolidation of the so-called “precision cosmology” that is today represented by the Λ-CDM model, according to which the Universe consists of 70% dark energy (DE), 25% dark matter (DM), and 5% ordinary matter, galaxies were the main objects that encoded the information on the cosmological parameters. Constraints on Ω and the other cosmological parameters were derived from galaxy observations.
A still open problem is that we do not know the nature of DM, although we can constrain its distribution through dynamical studies of clusters and satellite systems, through the intergalactic absorption visible in the spectra of high-redshift objects, and by studying gravitational lenses.
Gravitational lenses are particularly promising, since future large-scale imaging surveys will likely increase the number of strong lensing candidates. These objects are difficult to find, but great results are expected from the automated search methods now in rapid development (see e.g., Alard).
The SKA, LSST, and the Euclid space telescope will likely increase the number of lenses by orders of magnitude (see e.g., Oguri and Marshall).
Today, despite past efforts, there is still considerable uncertainty concerning the mass of the DM halos (see the contribution of P. Kroupa in D'Onofrio et al.).
The situation is promising in particular because astronomers have learned how to model the behavior of the DM component with N-body simulations on large computers. Unfortunately, the behavior of the baryonic component is complex; this will be a key question for the future. The problem is the large dynamic range of the baryon interactions, from the scale of stars to that of galaxies. What we know is that baryons collapse in the DM halos forming the first stars, and that the gas often feeds the large super-massive BHs at the centers of galaxies, giving rise to enormous feedback effects in terms of energy and matter moved all across the galaxy body.
The future astrometric mission Theia is aimed at probing the dark matter distribution in galaxies and the power spectrum of density perturbations. Theia could permit a detailed study of the shape of the dark matter profiles (cored or cuspy), which are known to depend on different processes induced by baryon physics, such as star formation, self-interaction, BH growth, etc. (Read et al.).
The standard hierarchical model of galaxy formation and evolution has so far made it possible to follow the evolution of cosmic structures, from the creation of the first galaxies up to the appearance of the galaxies we see today. The “Illustris” project is one of these large-scale cosmological simulations (see e.g., Springel).
Despite the success of these simulations, a number of severe problems still affect the galaxies and structures that are reproduced. One is that it is difficult to form realistic disk structures (the so-called angular momentum catastrophe). Another is that the amount of stars predicted with a simple physical recipe is largely overabundant with respect to what observations tell us. There are today a number of tensions between theory and observation that will likely characterize the future epoch. Many of these probably stem from the fact that we do not understand the complex physics of baryons well enough.
In any case, the enormous growth of numerical simulations will likely characterize the years to come. At present we are still testing the power of simulations in representing reality. The complexity of the problem does not allow numerical calculations to fully capture the correct answer across all scales of space and time. The finite resolution, i.e., the size of the smallest details that can be reproduced, implies that some processes, such as the birth of individual stars, cannot be followed by cosmological simulations. As a consequence, many physical approximations are necessary to accomplish the whole simulation.
The expected increase in computing power will certainly help numerical simulations and will be particularly useful when managing the enormous databases of galaxies at different redshifts planned by the various projects. New statistical approaches to the data should be adopted to extract the driving processes of galaxy evolution. It is in fact well-known that galaxy properties are mutually correlated: e.g., mass correlates with color, morphology, metallicity, SF rate, gas content, etc., and galaxy environment also plays a role. It is therefore far from simple to recognize the paths of evolution. In this respect the cladistic approach is promising (see Fraix-Burnet et al.).
The phylogenetic analysis provides information equivalent to that of “scaling relations,” but in a larger space defined by the number of parameters. The classical 2D or 3D scaling relations identify some paths of evolution: for example, the mass–metallicity relation constrains the amount of gas inflow and outflow during the cosmic epochs (see e.g., Hartwick).
Figure
The phylogenetic tree for the galaxies of the WINGS survey. Colors mark the different groups defined by the cladistic analysis on a set of parameters which includes color index, absolute magnitude, effective radius, and Sérsic index, among others. From Fraix-Burnet et al.
A further step forward in our understanding of galaxies will be achieved when we are able to trace the behavior of the cold neutral and molecular gas back to the first cosmic epochs. The drivers of LOFAR and SKA are their capability to probe deep into the redshift range of the reionization epoch, from 6 to 20, mapping the formation of massive galaxies, clusters, and black holes using the redshifted 21 cm line of neutral hydrogen.
The maps provided by, e.g., ALMA and SKA will be important to understand the relation between SF, gas density, and kinematics, and could contribute to clarifying the mechanisms of starburst and AGN activity with the associated feedback processes. The ALMA data for our own Galaxy will also provide the opportunity of resolving the gas transformation in the vicinity of the central BH. The environment around a BH is very poorly known. What we know is that the central regions of galaxies are very dense and are dominated by gas cloud collisions and strong magnetic fields. Knowledge of the chemical enrichment in this region is crucial to understand the origin of the BH itself. At the time of writing this contribution, strong molecular outflows have already been measured by ALMA using, e.g., the CO lines and other molecular transitions. All these data will provide a significant advance in our understanding of the feedback process and of the enrichment of the interstellar medium (ISM).
For neutral hydrogen, SKA with its unprecedented sensitivity will certainly contribute to a better defined picture of the formation and evolution of the first stars and galaxies after the Big Bang, and will provide important information on the role of cosmic magnetism, on the nature of gravity, and possibly even on the existence of life beyond Earth. Hydrogen is the most diffuse element in the Universe, and we can exploit its distribution to confront one of the mysteries of current cosmology: the nature and role of dark energy. DE is responsible for the observed acceleration of the Universe, but its nature is unknown. The next 50 years will likely be dedicated to solving the puzzle posed by the current cosmological model. SKA should be able to detect young forming galaxies at very high redshifts, so that HI maps might include millions of galaxies. The origin and evolution of cosmic magnetism will be one of the key research lines of the new astrophysics that can change the future of our understanding of galaxies.
The nature of DE can in principle be constrained by reconstructing the cosmic expansion history and the linear growth of cosmic structures. In this context, the future ESA mission Euclid, by mapping billions of galaxies, will be able to chart the geometry of the dark Universe. Classical spectroscopy is still mandatory to check for systematic effects in all measurements. Cluster velocity dispersions also require precise spectroscopy to reconstruct their evolution, and spectra will be fundamental to test AGN activity and systematic variations in the progenitor properties of SNe (a method that requires a good knowledge of metallicities, SFRs, and dust contents).
The large-scale photometric surveys used, for example, for BAOs or for lensing statistics also require a precise spectroscopic calibration. Baryonic acoustic oscillations (BAOs) are regular density fluctuations of the fluid of baryonic matter and photons present in the primordial Universe during the clustering of structures. Pressure generated expanding sound waves that were imprinted on this fluid. With the expansion of the Universe the propagation of the pressure waves stopped: the photons streamed away, while baryonic matter and DM remained locked together by gravitational attraction. This gave rise to the acoustic peak visible in the data of the SDSS and 2dFGRS as a characteristic-scale bump of galaxy clustering in the power spectrum (Cole et al.).
Figure: The early stages in the evolution of AGN and quasars: merging and strong interaction lead to accumulation of gas in the galaxy central regions, inducing a burst of star formation.
The sketch of the Figure summarizes two broad issues. (1) The connection between black hole growth and the build-up and evolution of galaxies, which involves the interplay between accretion and star formation. One side of this issue concerns the physical conditions (e.g., fueling mode, triggering mechanism) that initiate major black hole accretion events; the other involves the mechanical and radiative output of the quasar (understood as an accreting black hole): what is the nature of AGN feedback? (2) The accretion process itself. The basic process of accretion is self-similar, although it may take different forms as a function of accretion rate, black hole mass, and spin, and these parameters are expected to be a function not only of cosmic epoch, but of environment as well (for example, merging leading to a sizable population of massive black hole binaries).
These issues are addressed by detailed studies of both nearby and distant SMBHs, and will of course benefit from the wide array of instruments providing very high spatial resolution from the ground (active optics; interferometers) and from space.
ALMA can locate star-formation activity hidden by dust, and identify spectroscopically the cooling of molecular clouds with primordial chemical composition. ALMA is a powerful tool with the potential of clarifying the inter-relation between star formation, metal enrichment and SMBH accretion-induced activity. As an example of an application of the ALMA data, we can consider the [CII] 158 μm line that is strong in star-forming galaxies, and is the dominant cooling mechanism for cold interstellar gas. Kimball et al. (
The radio surveys with SKA (Diamond,
The existence of a SMBH (J1342+0928,
In the standard big bang cosmology, the baryonic matter of the Universe, following the hot phase after the big bang, should have undergone a rapid cooling and become mostly neutral. It may have remained so until the first accreting black holes and the first shining massive stars produced enough radiation to reionize it (Jiang et al.,
Determining the relation between star formation (a Population III of stars), accretion (onto a direct-collapse black hole?), and reionization during early cosmic epochs will connect the first light sources to the processes that assembled galaxies after reionization. Quasars are apparently not sufficient for the re-ionization of hydrogen: the number of ionizing photons inferred from the luminosity function at z ≈ 6 is apparently insufficient to keep the Universe ionized, given also that the soft X-ray background sets limits on accretion power at high redshift (McQuinn,
Near- and mid-IR spectroscopy, in addition to X-ray observations (discussed in more detail below), are crucial to understand the quasar accretion properties. JWST and E-ELT will be suitable instruments to characterize the first luminous sources, to reconstruct the ionization history of the early Universe, and to analyze how AGN and star formation evolved from the epoch of reionization to the present day (Gardner et al.,
The redshift frontiers depend on luminosity. Relatively low-mass black holes radiating at modest Eddington ratio remain undetectable at high
The most important epoch for investigating the relation between accreting black holes and galaxies is the redshift range 1–4, when most black holes gained most of their masses and when most accretion power was released. X-rays are well-suited for studying in detail black hole feedback, although they are only one of the many spectral ranges that need to be covered to get a complete view of the phenomenon. Feedback is a process that ultimately originates in the innermost regions close to the supermassive black hole and is dominated, in terms of energy and mass flow, by material over a wide range of ionization stages. Current studies of the incidence, nature and energetics of AGN feedback are mainly restricted to the local Universe (with only very limited knowledge of the most deeply enshrouded, Compton-thick, black hole population), but systematic studies of AGN feedback to
A crucial process is the phase in which quasar winds “invade” the host galaxy (mechanical feedback). According to models, quasar outflow rates may reach thousands of solar masses per year at high
A second key question is: how do accretion disks around black holes launch winds/outflows, and how much energy do these carry? The answer suffers from the poor understanding of the structure and dynamics of the broad line emitting regions, within 10⁴ gravitational radii from the central black hole. Certainly, not all quasars show powerful winds able to influence the global evolution of their hosts. Some authors distinguish between wind- and disk-dominated quasars (Richards et al.,
Even if there is agreement about the existence of an accretion disk and convincing evidence of outflows, the launching mechanism and the physical processes involved are only crudely understood today. Significant progress should come not only from multi-frequency simultaneous observations (optical, UV, and X-ray) but also from smoothed-particle hydrodynamics (SPH) simulations, which are expected to improve in numerical sophistication and in the treatment of physical processes (see e.g., Sa̧dowski et al.,
Variability at all wavelengths is one of the defining properties of AGN. The most rapid variations in γ-rays are on the scale of only a few minutes. The very rapid variability of flares puts strong constraints on the size of the emitting region and its bulk velocity, due to light crossing-time arguments. However, a fundamental question such as "what causes the observed variability in AGN from time scales of a few years down to a few minutes?" remains without a convincing answer at present. As for the analysis of single-epoch spectra of individual quasars, the potential of spectral variability to constrain quasar models has not been sufficiently explored. The planned "panoptic" SDSS-V is intended to exploit this potential on wide scales. The SDSS-V plans to carry out spectroscopic reverberation mapping, sampling hundreds of epochs for ~10³ quasars (0.1 <
Changing-look AGN, in which the broad lines in the AGN spectra either appear or disappear (i.e., passing from type-1 to type-2, or vice versa), an extreme case of line profile variability (LaMassa et al.,
The new capabilities offered by multiplexing spectrographs, as well as a significantly longer temporal baseline for the monitoring (periods, often corresponding to the dynamical timescale of the BLR, are of the order of tens of years), will likely lead to an assessment of the prevalence of supermassive binary black holes in the local and in the remote Universe, as well as of the interplay of magnetic fields, viscosity, and turbulence in accretion disks. Astrometric measurements with resolution ~1 μarcsec have the potential to resolve sub-pc binary black hole systems up to high redshifts.
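The reach of ~1 μarcsec astrometry can be checked with the small-angle relation θ = a/D_A, where a is the projected binary separation and D_A the angular-diameter distance. The distance value below is an assumption chosen for illustration (roughly z ~ 1 in a flat ΛCDM model), not a figure from the text.

```python
import math

UAS_PER_RAD = 180.0 / math.pi * 3600.0 * 1.0e6   # ~2.06e11 micro-arcsec per radian
PC_PER_MPC = 1.0e6

def separation_uas(sep_pc, d_ang_mpc):
    """Small-angle apparent separation theta = a / D_A, in micro-arcseconds."""
    return sep_pc / (d_ang_mpc * PC_PER_MPC) * UAS_PER_RAD

# A 0.1-pc binary at an assumed angular-diameter distance of ~1700 Mpc
# gives a separation of order ten micro-arcseconds.
print(f"{separation_uas(0.1, 1700.0):.1f} micro-arcsec")
```

A separation of this order is indeed above a ~1 μarcsec resolution limit, consistent with the claim that sub-pc binaries could be resolved out to high redshift.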
The gas in the accretion disk may lose up to almost half of its energy within 1,000 gravitational radii, resulting in powerful UV and X-ray emission. The strong gravity field implies that general and special relativistic effects are detectable in the emitted radiation, not only in the hard X-ray domain but in the optical and UV as well. It is in close proximity to the event horizon that differences in the spacetime metric due to black hole rotation become appreciable. Lense-Thirring precession may be at the origin of warped disks when the angular momentum of the accreting material is misaligned with the spin angular momentum of the black hole (Bardeen and Petterson,
The ISCO of the accreting gas also depends on the black hole spin, being closest to the black hole for maximally rotating black holes. In other words, the determination of the ISCO is a key endeavor because it is an indirect measurement of the black hole spin. This has important consequences for the radiation emitted by the accretion disk which is hotter in the case of high spin (Wang et al.,
LISA design is suited to detect the signal for coalescing black holes with masses ≲ 10⁶ M⊙ in the source frame up to
In radio-quiet quasars, X-rays are produced by Comptonization of thermal disk photons in a hot corona. Among radio-loud quasars, photons from the radio jet also contribute as seeds in the inverse Compton scattering process (Pian et al.,
It is not clear what the hot corona might actually be: a compact “sphere” (as it is often modeled) or clumps above the disk that illuminate the disk? How does the corona depend on the accretion status of the SMBH? The time lag between changes in primary radiation emission from the corona and the reprocessed emission from the disk provides a tool to measure the distance between corona and illuminated disk, as in the case of optical and UV reverberation mapping. Results on the Kα response of several nearby AGN are already available (Kara et al.,
The radio source Sgr A* at the center of the Galaxy, interpreted as the radio emission from a supermassive black hole, shows a compact and diffuse morphology at high radio frequencies above 200 GHz. The prediction is that a global VLBI array of radio telescopes, the global Event Horizon Telescope, could detect the shadow of the event horizon (Falcke,
Radio-loud AGN produce collimated relativistic outflows by a still poorly-understood process. Acceleration occurs extremely close to the SMBH (to explain remarkably short variability timescales), within a few tens of gravitational radii, but what are the sufficient conditions for an efficient acceleration to ultra-relativistic speeds (Lorentz factors ≳ 10–100)? This may indicate a Lorentz factor much larger than previously thought, or hadronic acceleration. Very- and ultra-high-energy (VHE and UHE) observations are the best tool to probe the physics of jet formation and the interaction of the black-hole magnetosphere with the accretion disk corona.
The SED of bright blazars is well explained by leptonic emission scenarios, where the radiative output throughout the electromagnetic spectrum is assumed to be dominated by electrons and possibly positrons (Celotti and Ghisellini,
Radio-loud quasars are among the likely sites of acceleration of UHE CRs, with energies up to around 100 EeV. γ-ray and neutrino observations also allow searches for UHECRs. γ-ray imaging observatories such as the CTA are expected to explore with unprecedented sensitivity the γ rays in the energy range from 50 keV to 2 MeV, which are the best tracers of CRs: low event statistics and the deflection of charged particles in extragalactic and Galactic magnetic fields make a direct search for UHECR sources difficult. The γ-ray observations are expected to identify beacons (the γ RL quasars) that track the cosmological evolution of black holes down to the epochs of galaxy formation. Gravitational wave, γ-ray, and hard X-ray observations could provide a solution to the long-standing problems of the energy source of reionization and of the role of accreting black holes in the formation of protogalaxies.
The quasar spatial distribution has been used as a tracer of large-scale structure and BAOs (e.g., Zarrouk et al.,
Quasars have a tremendous potential for cosmology, but that potential is as yet unexploited since they are not standard candles in conventional terms. Early efforts to establish correlations between luminosity and one or more parameters (for example, the equivalent width of high-ionization lines, the so-called Baldwin effect; Baldwin et al.,
Table: Foundations of several methods that have been proposed to exploit quasars as distance indicators.
Sample | Method | References
--- | --- | ---
Extremely radiating quasars (xA) | Hard-X-ray slope, velocity dispersion | Wang et al.
Extremely radiating quasars (xA) | Virial velocity dispersion, FWHM(Hβ) | Marziani and Sulentic
General quasar population | X-ray variability, velocity dispersion | La Franca et al.
Mainly quasars | Reverberation mapping, time delay τ | Watson et al.; Czerny et al.
General quasar population | Non-linear UV–X relation | Risaliti and Lusso
At very high accretion rate, the luminosity-to-black hole mass ratio (
Highly accreting quasars can be considered as “Eddington standard candles:”
Distance modulus of the “extreme sample” of Negrete et al. (black dots) and the Kessler et al. (
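The "Eddington standard candle" logic for extreme accretors can be sketched numerically: if L ∝ M_BH at a fixed (extreme) Eddington ratio, M_BH ∝ r ΔV² from the virial relation, and r ∝ L^0.5 from the radius-luminosity relation, then L ∝ (FWHM)⁴, and the inverse-square law yields a luminosity distance. The normalization L0 below is a placeholder chosen for illustration, not the published calibration.

```python
import math

CM_PER_MPC = 3.0857e24

# Placeholder normalization, for illustration only; the published
# calibration of the "virial luminosity" differs.
L0 = 1.0e44   # erg/s at FWHM = 1000 km/s (assumed)

def virial_luminosity(fwhm_kms):
    """Virial luminosity L ~ FWHM^4: L ~ M_BH (fixed Eddington ratio),
    M_BH ~ r dV^2, and r ~ L^0.5 from the radius-luminosity relation."""
    return L0 * (fwhm_kms / 1000.0) ** 4

def luminosity_distance_mpc(fwhm_kms, flux_cgs):
    """Inverse-square law: d_L = sqrt(L / (4 pi F)), in Mpc."""
    d_cm = math.sqrt(virial_luminosity(fwhm_kms) / (4.0 * math.pi * flux_cgs))
    return d_cm / CM_PER_MPC
```

Note the strong FWHM⁴ dependence: the inferred distance scales as FWHM², so accurate line-width measurements are the limiting factor of the method.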
The BLR size has been suggested as a standard ruler, in a way that is conceptually analogous to the BAOs. The cross-correlation function between the continuum and the emission-line light curve measures a time lag τ, meaning that the distance of the BLR from the central continuum source can be written as r_BLR = cτ.
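The time lag directly gives the BLR radius, r = cτ; combined with a radius-luminosity relation it yields a luminosity, and hence a distance modulus from the observed flux. The r-L coefficients below are placeholders for illustration, not a published fit.

```python
import math

# Placeholder radius-luminosity coefficients (illustrative, not published):
# log(r / light-day) = A_RL + B_RL * log(L / 1e44 erg/s)
A_RL, B_RL = 1.5, 0.5
CM_PER_PC = 3.0857e18

def luminosity_from_lag(tau_days):
    """r = c*tau gives the BLR radius in light-days; inverting the assumed
    radius-luminosity relation returns the continuum luminosity in erg/s."""
    log_r = math.log10(tau_days)               # radius in light-days
    return 10.0 ** ((log_r - A_RL) / B_RL) * 1.0e44

def distance_modulus(tau_days, flux_cgs):
    """mu = 5 log10(d_L / 10 pc), with d_L from the inverse-square law."""
    L = luminosity_from_lag(tau_days)
    d_pc = math.sqrt(L / (4.0 * math.pi * flux_cgs)) / CM_PER_PC
    return 5.0 * math.log10(d_pc / 10.0)
```

In this scheme the lag plays the role the acoustic scale plays for BAOs: a physically calibrated length converted into a distance through an observed angle-like quantity (here, the flux).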
The non-linear relation between soft X-ray and UV luminosities has also been used to build a Hubble diagram (Risaliti and Lusso,
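Under the assumption that a non-linear relation log L_X = γ log L_UV + β holds at all distances, it can be inverted to obtain the luminosity distance from the two fluxes alone, because L = 4πd²F enters the two sides with different powers. The slope γ ≈ 0.6 is the commonly quoted approximate value; the normalization β below is an illustrative placeholder, not the published fit.

```python
import math

GAMMA = 0.6   # approximate slope of the non-linear X-ray/UV relation
BETA = 26.5   # normalization: illustrative placeholder, not the published fit

def luminosity_distance_cm(flux_uv, flux_x):
    """Invert log L_X = GAMMA*log L_UV + BETA with L = 4*pi*d^2*F.

    Substituting and collecting the distance terms gives
    log(4*pi*d^2) = (BETA + GAMMA*log F_UV - log F_X) / (1 - GAMMA).
    """
    log_4pid2 = (BETA + GAMMA * math.log10(flux_uv)
                 - math.log10(flux_x)) / (1.0 - GAMMA)
    return math.sqrt(10.0 ** log_4pid2 / (4.0 * math.pi))
```

The key design point is that no absolute calibration of a single source is needed: the distance follows from the *difference* between the observed X-ray flux and the X-ray flux predicted from the UV flux through the relation.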
For the first time in human history, the next decades will see the ability to cover the sky in a panchromatic fashion, with a resolution that varies across the electromagnetic spectrum but is sufficient to resolve at least the brightest extragalactic sources from the low-frequency radio to the γ-ray domain. This ability will be enhanced by "synoptical monitoring" capabilities, at least in the visual bands. Optical sky surveys will make data available for sources down to ≈ 28 mag, and for brighter but still very faint sources (
We are confident that several of the main issues that are hotly debated and that need observational and computational improvement will become, if not fully settled, at least better understood: (1) the role of nuclear activity in host galaxy evolution, over a broad range of redshift; this fundamental issue will benefit from the ability of radio, mm, X-ray, and gravitational wave observatories to trace nuclear activity in obscured sources; (2) the main players of reionization at the redshift frontier; (3) the inner structure of quasars, involving the physics and dynamics of the emission line regions, including the physics and modeling of disk winds; (4) the real nature of the massive compact object in the nuclei of galaxies (we may obtain a final answer to the question: is it really a black hole?); (5) the origin of the relativistic radio jets and the mysterious high-energy phenomena occurring in AGN, which should become more constrained by radio, X-ray, and γ-ray observational developments; (6) last, the possibility that quasars may be exploited as distance indicators, which will certainly be explored by several groups.
The same could be said for our understanding of galaxies and clusters. Multi-messenger data will likely allow us to map the star formation history of galaxies down to the first epochs, in close connection with the growth of SMBHs. Most of the questions outlined before will find possible answers. Thanks to the enormous mass of data for billions of galaxies, we will also have in the near future a much better understanding of the large-scale structure of the Universe and, in close connection with this, it will likely be possible to clarify the nature of DM and DE.
We should not be oblivious to the fact that many of the advancements hypothesized in the previous sections depend on the preservation of the social and economic conditions that make it possible for science to progress. They will also imply vast educational efforts. However, if the advancements progress as expected, the amount of data and model sophistication may appear overwhelming. Large elite collaborations may monopolize the frontier fields of astronomical research, with costly dedicated instruments. Will there still be a place for the work of amateur astronomers and of citizen scientists?
We may even ask whether there will be anything left to discover. Will astronomers be reduced to priestesses and priests of a static wisdom, just monitoring that nothing unexpected or unpredicted by models is happening? We believe there is a chance that this might happen, although not in the next 30 years and perhaps not even before the end of the century. As the boundary of humankind spreads beyond our home planet Earth in the next decades, we will be able to see further and further, with interferometers of longer and longer baselines probing deeper and deeper into the dark ages at the cosmic frontiers. There will still be fainter sources that escape detection, and spatial details that we will not be able to resolve, as well as intrinsically stochastic processes that will be impossible to predict or model. And we may still face challenging aspects related to the inability to see beyond the cosmic horizon, if we attempt to analyze the global topology of the Universe (Luminet,
All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. The reviewer AL and handling editor declared their shared affiliation at time of review.
1. The angular resolution is defined by the angular radius of the Airy disk, θ ≈ 1.22 λ/D ≈ 0.025″ (λ/1 μm)(D/10 m)⁻¹
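The diffraction-limit scaling in this footnote can be evaluated directly; the helper below implements θ = 1.22 λ/D in the footnote's units (λ in μm, D normalized to a 10-m aperture), and the 39-m example aperture is an illustrative assumption.

```python
def airy_resolution_arcsec(wavelength_um, diameter_m):
    """Diffraction limit theta = 1.22*lambda/D, expressed as
    ~0.025 arcsec x (lambda / 1 um) x (10 m / D)."""
    return 0.025 * wavelength_um * (10.0 / diameter_m)

# An E-ELT-class 39-m aperture at 1 um, for illustration (milli-arcseconds):
print(f"{airy_resolution_arcsec(1.0, 39.0) * 1000:.1f} mas")
```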
2. In the following we will use the term AGN to indicate all accretion-powered black holes, and "quasars" to indicate mainly high-luminosity AGN. However, it is important to stress that there is no critical luminosity divide.
3. Even if some of the proposed projects may not be completed, or developed as assumed here, we are confident that instrumentation with similar technical specifications will become operational over the next decades.
4. Dark is used to indicate an early cosmic age at