The satellites are all designed and built at Surrey Satellite Technology Ltd. (SSTL) in the U.K. Through the support of the British National Space Centre, SSTL owns and operates the U.K. satellite in this constellation. Although its headline objective is to support the logistics of disaster relief, DMCii's main function is to provide independent, daily imaging capability to the partner nations: Algeria, Nigeria, Turkey, the U.K., and China.
DMC satellites provide unique Earth Observation resources that enable daily revisits anywhere in the world. This is possible with only a few satellites as each one is designed to image a large area of up to 600 x 600 km. This greatly improves the value of the data as it often avoids the need for mosaics of images from different seasons.
All DMC Members agree to provide 5 percent of capacity free of charge for daily imaging of disaster areas. Initially, this data is channelled to aid agencies through Reuters AlertNet. The DMC Consortium has agreed to consider participation in the International Charter on Space and Major Disasters, contributing daily imaging capability to fill the existing 3 to 5 day response gap. UK-DMC also provides data through an ESA project called RESPOND. In addition, the DMC Members are interested in encouraging the use of DMC data for scientific and commercial applications.
The builder of the small satellites, Surrey Satellite Technology Limited, is a privately owned company, with ownership shared among...
- The University of Surrey (85 percent): SSTL originally started as a department of the University devoted to space and satellite research. The University saw the potential of a commercial enterprise to further develop its new satellite technology and incorporated Surrey Satellite Technology Limited (SSTL) in 1985.
- SpaceX (10 percent): SpaceX is a launch provider in the U.S., founded by PayPal co-founder Elon Musk in 2002. With a very similar approach to space exploration (combining low-cost, high-speed, reliable space technology), SpaceX is a natural partner for SSTL.
- SSTL staff (5 percent): Professor Sir Martin Sweeting and SSTL employees hold 5 percent of SSTL shares.
In April, EADS Astrium announced its decision to purchase the University of Surrey's shares in SSTL, in what is believed to be one of the largest financial returns ever realised by a British university. The acquisition is now a fait accompli, as the European Commission has given its final blessing: the European Union's antitrust regulators have approved the blending of the former University of Surrey property with Astrium for an 80 percent stake. The actual financial considerations have not been revealed as of this writing.
Dr. Steve Mackin pioneered a new approach for deriving quality control indicators from Disaster Monitoring Constellation data. The new framework, which is being implemented by DMCii, holds great potential for quality control and consistency in multi-source imaging projects, such as the European Global Monitoring for Environment and Security (GMES), now known as Kopernikus. The European Space Agency (ESA) has expressed interest in the techniques that Dr. Mackin presented in his role as one of the U.K.'s representatives in the Working Group for Constellation Calibration on the Committee on Earth Observation Satellites (CEOS). The first dedicated GMES satellites, Sentinel-2 and Sentinel-3, will demonstrate (at least in part) the new framework as a quality control measure for GMES.
Dr. Mackin commented, "This has never been done before, and its application holds great potential for projects where imaging is sourced from multiple providers and satellites." As a GMES contributor, DMCii has begun implementing this new quality control framework within the Disaster Monitoring Constellation to validate it for wider use.
The new framework provides a clearer quality statement with defined error budgets at each stage and hence identifies low quality data before it can be issued. The traceability of data is also improved, enabling the rapid identification of the processing area at fault.
Dr. Mackin states the proposed methodology holds many benefits for imaging users: "It makes sense for any customer to request standardized quality control information from imaging suppliers. Only then can you be sure of the quality of your end product and its fitness for purpose. It also allows users to compare data across image providers in a fast and simple manner and determine who meets the user's requirements at the lowest cost, hence saving time and money for the end-user."
SatMagazine
Thanks for taking the time to talk with us, Dr. Mackin. We appreciate your insight. Please tell us what this new constellation calibration achieves.
Dr. Mackin
It makes sense for any customer to request standardized quality control information from imaging suppliers. Only then can you be sure of the quality of your end product and its fitness for purpose.
For the first time, the system should allow true traceability through the entire processing chain, from data acquisition to higher-level product generation, with uncertainties described for each step of the process, right down to fundamental operations, and with corresponding QC of the outputs from each of these fundamental operations. The amount of work to set this up in a modular form is huge; even small processing chains, such as those used by DMCii Ltd., potentially comprise hundreds of modules.
However, once created, it is simple to build quite complex QA/QC (Quality Assurance / Quality Control) chains for new sensors by re-using the modules, much like objects in C++ programming. Currently, traceability is either limited to small parts of the processing chain or absent altogether. For the first time, we should be in a position to say "I have a determined uncertainty on this product" and prove it without extensive validation exercises, and hence directly cross-compare data sets and derived products using simple quality indices.
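As a rough illustration of that re-use (a sketch only, with hypothetical module names and invented uncertainty figures, not DMCii's actual processing code), the snippet below treats each module as a small object carrying its own uncertainty budget, assembles a chain for a new sensor from existing modules, and combines the chain-level uncertainty by root-sum-square on the assumption that the contributions are independent:

```python
# Illustrative sketch only: hypothetical modules and invented uncertainty figures.
from dataclasses import dataclass
from math import sqrt


@dataclass
class QAModule:
    """One independent measurement or processing step with its QA uncertainty budget."""
    name: str
    uncertainty_pct: float  # standard uncertainty contributed by this step, in percent


def combined_uncertainty(chain):
    """Root-sum-square combination, assuming the module contributions are independent."""
    return sqrt(sum(m.uncertainty_pct ** 2 for m in chain))


# Modules are written once and re-used across sensors, much like objects in C++.
dark_current = QAModule("dark current correction", 0.3)
flat_field = QAModule("flat-field / diffuser correction", 0.8)
absolute_cal = QAModule("absolute radiometric calibration", 1.5)

# A new sensor's QA/QC chain is assembled largely from the existing modules.
chain = [dark_current, flat_field, absolute_cal]
print(f"combined radiometric uncertainty: {combined_uncertainty(chain):.2f} %")
```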
SatMagazine
What was the driving force behind this decision?
Dr. Mackin
The ideas have been developing slowly within CEOS WGCV (the Committee on Earth Observation Satellites Working Group on Calibration and Validation) for many years and have been partially addressed within the level processors of the major space agencies. There is a growing consensus among the agencies that some form of traceability and quality statement is needed for every EO data set. This is driven partly by the rapid growth of applications expected under the European Kopernikus (formerly GMES) initiative and partly by the requirement from the climate change scientific community for statements of uncertainty in their input parameters, to improve model prediction performance.
Hence, there is a push from ESA for increased information on Third Party Missions (data quality, calibration, and so on), plus the requirement from Kopernikus for a quality statement for each data product. CEOS WGCV has taken up these concerns, as it is driven by the agencies to a large degree. This led to the QA4EO (Quality Assurance Framework for Earth Observation data) initiative, developed by a small group including Nigel Fox (NPL) and Pascal Lecomte (ESA), whose guidelines form the basis for traceability in EO data. This has been adopted by CEOS WGCV and is discussed in some depth on the CEOS WGCV Cal/Val Portal at http://calvalportal.ceos.org/CalValPortal/qa4eoInfo.do
DMC has worked closely with ESA and NPL in supporting the activity. The biggest question now, raised at the Avignon meeting of CEOS WGCV at the beginning of October last year, is how to implement the high-level QA4EO guidelines. At the same meeting, DMC presented information on how it has begun to take the first steps in implementation.
SatMagazine
Why has DMCii adopted this technique?
Dr. Mackin
DMCii was really one of the first companies to offer constellation data. As we operate a constellation of satellites, we are able to revisit locations on a daily basis to provide change monitoring and achieve cloud-free imaging over very large areas within a given timeframe, which is impossible with a single satellite. As the company has matured and become involved in many high-profile and demanding projects, we have developed our own measures and methods of calibration and quality control.
Through our work with ESA as part of the Kopernikus project, we identified the customers' need for universal measures and procedures to assist with purchasing and operational decisions. Kopernikus is an ambitious European project that seeks to combine remote sensing information from many different suppliers to provide global monitoring information for environment and security services. DMCii has adopted this method for many reasons...
- Satisfies ESA Third Party Mission (TPM) and Kopernikus requirements on quality information
- Provides the means for an automated QA/QC system, including automatic intervention if system parameters are exceeded
- Provides a means of simulating new sensors and new methodologies, which can be included rapidly in the proposed modular system
- Guarantees traceability through the entire system
- Helps identify the processes that add uncertainty to the final data product, which can then be targeted for replacement in future modifications to the processing chain
- Provides a rapid means of developing future QA/QC systems for new satellites
- Provides, as the constellation expands, an automatic means of managing data over very large constellations, which may have mixed system characteristics
SatMagazine
How will the framework be applied to ESA's own Sentinel satellites?
Dr. Mackin
ESA is proposing to incorporate the ideas for use in both Sentinel-2 and Sentinel-3. The difficulty in this case is that the contracts for the level processors have already been assigned. Hence, the QA/QC system will have to sit outside the level processor and interface with it. This is not the ideal way of developing the system; however, by interfacing with the level processor correctly, it should be possible to have a parallel chain that provides the QA/QC information without impacting too much on the current developments.
The big difference with the Sentinel Missions is the much larger number of modules required, which means a slow implementation covering several years. However, the development is in its early stages, so we need to wait to see the benefits for the Sentinel Missions. In the future, we would expect much tighter integration of the QA/QC system and the level processor.
Note, however, that the level processor is a temporal sequence, while the QA/QC structure is not. The QA/QC shows the uncertainty flow for the whole temporal sequence; for example, a QA/QC sequence can be "Get Dark Current Data" followed by "Get White Diffuser Measurement", while in reality these two measurements may be separated by a long interval, from minutes to days. For QA/QC purposes they form part of a single sequence.
SatMagazine
How does it work?
Dr. Mackin
The system is entirely modular. Every process in a processing chain is identified, and a module is created for every independent measurement. By independent, we mean the whole measurement can be encapsulated in a single module. Modules can be aggregated into larger management modules to make the system more manageable. For each single module, there is a description, a protocol, a reference standard (if required), and an uncertainty budget. This is essentially the QA element of the process. The protocol needs to be accepted to some degree (certified, for want of a better word). Outside the module sits a corresponding QC element that tests that the output from the module meets the QA defined within the module, in terms of uncertainty.
The QC element, in theory, can contain feedback actions for cases where the limit set in the QA is exceeded, even to the point of modifying the process. An example might be the dark current measurement for calibration. Dark current tends to increase with time. In theory, if an upper limit were set in the QA for the noise component, this could be exceeded over time. However, again in theory, with an automated system the QC could detect the out-of-bounds condition and modify the process to increase the number of dark image lines taken, bringing the dark current noise component back within the defined limits for calibration. This is a simple example of how the system could automatically control the operation.
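A minimal sketch of that feedback loop (invented noise figures and limits, not DMCii's actual calibration software): averaging N dark lines reduces the dark-current noise roughly as 1/sqrt(N), so when the QC check finds the QA limit exceeded it can feed back a larger N until the output is back within budget.

```python
# Illustrative sketch only: invented noise figures and limits.
from math import sqrt, ceil

QA_NOISE_LIMIT = 0.5       # maximum acceptable dark-current noise (arbitrary units)
single_line_noise = 4.0    # noise of one dark image line; tends to grow as the detector ages


def dark_noise(n_lines):
    """Noise of the averaged dark estimate: averaging N lines reduces it by ~1/sqrt(N)."""
    return single_line_noise / sqrt(n_lines)


def qc_check_and_adjust(n_lines):
    """QC element sitting outside the module: if the QA limit is exceeded,
    feed back a larger number of dark lines to bring the noise within budget."""
    if dark_noise(n_lines) <= QA_NOISE_LIMIT:
        return n_lines  # within the QA budget, no action needed
    # Out-of-bounds condition: compute how many lines are needed and modify the process.
    return ceil((single_line_noise / QA_NOISE_LIMIT) ** 2)


n_dark_lines = qc_check_and_adjust(32)  # 32 lines no longer suffice, so QC raises this to 64
print(f"dark lines averaged: {n_dark_lines}, noise: {dark_noise(n_dark_lines):.2f}")
```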
Additionally, by using parameter files and test data, it would be possible to simulate an instrument QA/QC flow and predict the output data uncertainties, even prior to the instrument being physically created. As the instrument is developed, changes to parameters or processes can be substituted with no other impacts to the system as it is entirely modular.
SatMagazine
What makes this different to current quality control measures?
Dr. Mackin
Current QA/QC measures do not provide traceability throughout the whole system. Normally, they consist of QA/QC applied to particular steps in the processing, often with little or no quantification of the uncertainty at each stage of the processing.
The QA/QC can be patchy, with limited justification for the choice of protocol at certain stages. For example, why are 512 white lines used for calibration of MERIS, but 1024 dark lines for dark current estimation? There is no real justification for the choice of numbers and no quantification of the residual error arising from that choice.
The overall aim would be to apply guidelines at a high generic level (based on those developed for QA4EO) and from these develop a set of generic (high-level) modules that apply to a particular sensor type. At this level, these generic processes can be certified and form the basis for the development of physical modules that address the behaviour of the actual system and the processing of the data from that system. These physical modules could, in theory, be very different from operator to operator, but they will all follow the generic guidelines and hence be equivalent. For the physical modules, there will be a well-defined uncertainty budget and corresponding QC, which turn the certified generic modules into reality.
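One way to picture the split between generic and physical modules (a sketch under the assumption that modules are expressed as classes, with invented protocols and uncertainty figures): the generic module fixes the certified interface, while each operator supplies its own physical implementation with its own uncertainty budget, so the stated uncertainties remain comparable.

```python
# Illustrative sketch only: hypothetical classes, protocols, and uncertainty figures.
from abc import ABC, abstractmethod


class RadiometricCalibration(ABC):
    """Generic (certified) module: every physical implementation must state its
    protocol and its uncertainty budget, and is QC-checked against that budget."""

    @abstractmethod
    def protocol(self) -> str: ...

    @abstractmethod
    def uncertainty_pct(self) -> float: ...


class SimpleOnboardCalibration(RadiometricCalibration):
    """A simpler physical implementation: cheaper to run, larger uncertainty."""

    def protocol(self) -> str:
        return "average 1024 dark lines, single on-board lamp reference"

    def uncertainty_pct(self) -> float:
        return 4.0


class VicariousSiteCalibration(RadiometricCalibration):
    """A more complex physical implementation, traced to a characterised field site."""

    def protocol(self) -> str:
        return "instrumented desert site, transfer radiometer traced to a national standard"

    def uncertainty_pct(self) -> float:
        return 2.0


# Both follow the same generic guideline, so their stated uncertainties are comparable.
for impl in (SimpleOnboardCalibration(), VicariousSiteCalibration()):
    print(f"{impl.protocol()} -> {impl.uncertainty_pct():.1f} % uncertainty")
```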
SatMagazine
Does the QA/QC method only relate to constellations, or can it be applied to satellites such as Landsat?
Dr. Mackin
The point of this whole process is diversity of solutions within a well-defined set of generic guidelines. We do not wish to prevent innovation. Each protocol implementation may be very different, with different uncertainties. For example, DMC may have uncertainty at the 4 percent level in a radiance product, while Landsat may achieve 2 percent. However, the implementation from DMC may be simpler and still meet the requirements of its customers, while that from Landsat may be far more complex, producing a lower uncertainty in line with the needs of a proportion of its customers.
It is generally agreed that multiple solutions to the same measurement problem exist; all we are saying is that we require the uncertainty on those solutions so it can be given to the end-user, who can then weigh the lower cost and higher uncertainty of one product against the higher cost and lower uncertainty of a similar product for a specific application. Applications tend not to be uniform in their needs, so a higher uncertainty may be acceptable in many circumstances; the user can make the final selection. It's really about standardising the quality measurement of data, both within a constellation and between imaging sources. That said, projects using Landsat data in combination with other imaging data would benefit from such a QA/QC framework, as it would enable buyers to compare data with that from other providers.
If you consider that within GEOSS the aim is to have virtual constellations, with satellites from different countries and with different characteristics working together, as well as physical constellations such as DMC or RapidEye, then the method is equally applicable. The aim is that for each satellite in the virtual constellation we know the uncertainty in measurement, rather than estimating it using large validation campaigns. We can then compare the output from each sensor and choose those most useful for a specific application.
SatMagazine
You mentioned traceability. Why is this important?
Dr. Mackin
Currently there is no true traceability in EO data, and there are many assumptions about the uncertainties within the system. Many areas of uncertainty are poorly explored. The whole process is validated by using post-launch validation procedures against other, ground-collected data to determine the uncertainty on the final data product. This has limitations: it yields only a combined uncertainty, and in many cases the causes of that level of uncertainty are not known and hence cannot be reduced. With traceability, we know the uncertainty at each level of processing of the data and can therefore not only define the uncertainty for the final data product and compare it against any validation effort, but also know the contributors to our overall uncertainty budget.
In metrology institutes, any measurement must be traceable back to some form of international standard. In many cases, we should be able to trace the uncertainty back to an original calibrated diffuser or lamp standard on the satellite in question (for those with on-board calibration), or to an extensively characterised field site whose characterising instruments can themselves be traced back to international standards. This provides a direct means to say that, for example, the TOA (Top Of Atmosphere) radiance has a specific value with a specific uncertainty associated with it. This is a key element in global climate change studies, where the uncertainty on model input parameters (radiance of surfaces, water, land, cloud, etc.) must be known for accurate prediction.
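To put rough numbers on such a traceability chain (figures invented purely for illustration): if the lamp standard contributes 1.0 percent, the transfer to the on-board diffuser 1.5 percent, and instrument noise 0.8 percent, then, assuming the contributions are independent, the combined TOA radiance uncertainty is the root-sum-square, √(1.0² + 1.5² + 0.8²) ≈ 2.0 percent, with each contributor remaining individually visible in the budget.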
SatMagazine
How would these quality control measures typically be used by your customers?
Dr. Mackin
Currently, satellite operators do not give quality statements on their low-level products, and value-adders do not give quality statements on their higher-level products (except perhaps in the atmospheric sciences, which seem a little more advanced than other areas). It is impossible for an end-user or application developer in Kopernikus to say which data is most suitable for a specific application, or even to state the induced errors in any product produced from this data. It is difficult at this time to say there is a single quality measure. We are proposing that the quality information from each module can be aggregated into higher-level modules, with suitable indices designed for users with different needs. The lower-level information should still be there, either as a single value or a limited number of values in the metadata, or as quality products that may contain pixel-by-pixel quality information (if this is required).
It will require some experimentation with end-users to determine exactly what they require. Perhaps a simple labelling scheme, like the energy-efficiency labels on devices, for example. Or some quantitative statement of the uncertainty in the final product, such as ppm of an atmospheric gas, the uncertainty of water-leaving radiance, or the uncertainty in a DEM. In many ways this has still to be decided, and until that point the user will have the possibility to drill down through the metadata and indexed quality products to examine exactly how each product satisfies their needs.
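As a toy illustration of that simple labelling (thresholds and figures entirely invented, not an agreed Kopernikus index), module-level uncertainties could be rolled up into one figure and mapped onto an energy-label-style grade, while the full per-module values remain in the metadata for users who want to drill down:

```python
# Illustrative sketch only: invented uncertainty figures and grade thresholds.
from math import sqrt


def combined_uncertainty_pct(module_uncertainties):
    """Roll module uncertainties up into one figure (root-sum-square, independence assumed)."""
    return sqrt(sum(u ** 2 for u in module_uncertainties))


def quality_grade(uncertainty_pct):
    """Map the rolled-up uncertainty onto an energy-label-style grade."""
    if uncertainty_pct <= 2.0:
        return "A"
    if uncertainty_pct <= 4.0:
        return "B"
    if uncertainty_pct <= 6.0:
        return "C"
    return "D"


per_module = [0.3, 0.8, 1.5, 2.5]  # per-module standard uncertainties in percent (invented)
u = combined_uncertainty_pct(per_module)
print(f"product uncertainty {u:.1f} % -> grade {quality_grade(u)}")
```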
SatMagazine
Thank you, Dr. Mackin, for your explanation of the new calibration process. For further reader information regarding DMCii, please visit: http://www.dmcii.com/
About the author
Dr. Stephen Mackin is Chief Scientist for DMC International Imaging Ltd and a Principal Research Scientist at the UK National Physical Laboratory. His current primary research tasks are in calibration (Disaster Monitoring Constellation) and data quality. Dr. Mackin has an extensive background in Earth Observation, covering applications of hyperspectral remote sensing in geology and land degradation, as well as GPS reflectometry applications and low-cost thermal microbolometer systems for hot-spot monitoring.