The satellite industry faces three key challenges: the extreme environment in which satellites operate, the cost of getting satellites into orbit, and the cost of failing to get them into orbit. Together, these challenges make the data that satellites collect exceptionally valuable.
With the major advances in technology over the last 30 years, satellites, or rather satellite companies, have become far more adept at collecting data. As a result, every launch generates petabits of data, all of which needs to be sorted, stored and, most importantly, converted into information from which satellite companies can learn and advance their projects.
This data can have an impact well beyond pure scientific research. For example, if a launch fails or a satellite does not achieve its required orbit, the point at which the error occurs has significant ramifications for the insurance policy, the level of any claim and the determination of where liability lies.
In the event of a failure, reliable data also means that the project team can pinpoint and remedy the issue. If they can do that quickly, with a degree of certainty, and explain where the problem arose, they can approach their backers with confidence and are more likely to receive a positive reception.
Either way, reliable data that can be analyzed relatively easily helps ensure that an issue is resolved quickly and the project returns to the launch pad as soon as feasible.
What Data is Usually Collected?
The type of data that satellites collect varies from project to project and can be highly confidential. While there probably is some overlap between different companies and different satellites, the data gathered by each company can be very specific and proprietary.
What is true for all satellites is that data is received at very high rates through multiple channels. This makes it essential that any database receiving the information can handle vast volumes of data and then analyze that data, in any number of ways, quickly and efficiently.
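As a rough illustration only (the article does not describe any particular product, schema or API; the names below are invented and SQLite stands in for whatever store is actually used), the Python sketch that follows shows one common technique for sustaining high write rates from multiple channels: buffering incoming metric samples in memory, committing them in large batches, and then running a simple per-channel aggregate over the stored rows.

```python
# Hypothetical sketch: batching multi-channel telemetry samples so that each
# database commit covers many rows, amortizing transaction overhead.
# SQLite is used here purely as a stand-in store.
import sqlite3
import time

class TelemetryStore:
    def __init__(self, batch_size=10_000):
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE metrics (channel TEXT, ts REAL, value REAL)")
        self.batch_size = batch_size
        self.buffer = []

    def ingest(self, channel, ts, value):
        # Buffer in memory; flush once the batch is full.
        self.buffer.append((channel, ts, value))
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.conn.executemany(
                "INSERT INTO metrics VALUES (?, ?, ?)", self.buffer)
            self.conn.commit()
            self.buffer.clear()

    def per_channel_stats(self):
        # One simple example of "analyzing the data in any number of ways":
        # row counts and average value per channel.
        return self.conn.execute(
            "SELECT channel, COUNT(*), AVG(value) "
            "FROM metrics GROUP BY channel").fetchall()

if __name__ == "__main__":
    store = TelemetryStore()
    t0 = time.time()
    for i in range(1_000_000):                      # simulated samples
        store.ingest(f"ch{i % 8}", t0 + i * 1e-6, float(i % 100))
    store.flush()
    for channel, count, avg in store.per_channel_stats():
        print(f"{channel}: {count} samples, avg={avg:.2f}")
    print(f"elapsed: {time.time() - t0:.1f}s")
```

The batching pattern is generic; a real ground-segment system would tune batch sizes, durability settings and indexing to its own throughput and recovery requirements.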
The metrics and data gathered provide low-level insight into the performance of the satellite system, which in turn allows satellite companies to troubleshoot and fine-tune the network to optimize the service provided to their end users.
As the number and importance of smallsat constellations rise, data must be analyzed efficiently and effectively so that service issues in a single unit can be isolated and resolved before they risk cascading failures. Given the increasing reliance of various forms of transport on satellites, any outages need to be resolved quickly, and there will need to be certainty that back-up and redundant capacity are not at risk of similar problems.
Why Do Satellites Need to Handle Large Quantities of Data?
Satellite owners need to configure their satellites for optimum data transmission, and every packet from the satellite carries values that need to be analyzed.
These values change rapidly, so they must be monitored continuously and used to debug and optimize the quality of the transmission. Data capture rates of around one million metric entries per second are currently possible, a number that is rising rapidly as technology improves.
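As a hedged sketch of what such monitoring can look like in practice (the metric, window size and thresholds below are invented for illustration, not taken from any real system), a simple sliding-window monitor can track a rapidly changing per-packet value and flag degradation as it happens.

```python
# Hypothetical sketch: a sliding-window monitor over a rapidly changing
# per-packet metric (here an invented "link quality" value in the range 0-1).
# It raises a flag when the recent average falls below a threshold.
from collections import deque
import random

class LinkQualityMonitor:
    """Maintains a running average over the last `window` per-packet values."""
    def __init__(self, window=1000, threshold=0.7):
        self.samples = deque(maxlen=window)
        self.total = 0.0
        self.threshold = threshold

    def observe(self, value):
        if len(self.samples) == self.samples.maxlen:
            self.total -= self.samples[0]      # value the deque is about to evict
        self.samples.append(value)
        self.total += value
        window_full = len(self.samples) == self.samples.maxlen
        # Flag only once the window is full and the recent average has degraded.
        return window_full and (self.total / len(self.samples)) < self.threshold

if __name__ == "__main__":
    random.seed(42)
    monitor = LinkQualityMonitor()
    for packet_no in range(50_000):
        # Simulated per-packet quality; degrades after packet 30,000.
        lo, hi = (0.8, 1.0) if packet_no < 30_000 else (0.4, 0.8)
        if monitor.observe(random.uniform(lo, hi)):
            print(f"degraded link quality detected at packet {packet_no}")
            break
```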
Due to the extensive testing before a satellite, or a constellation of satellites, is launched, hardware and/or software anomalies are rare. However, when they do occur, the metrics gathered are critical to detection and, ultimately, resolution. Fast resolution is, of course, critical as anomalies can lead to the inefficient use of precious satellite resources, such as bandwidth and power.
One of the key challenges is that the amount of data a satellite can produce roughly doubles each year; compounded, that is an increase of about 32x (2^5) every five years and roughly 1,000x (2^10) every decade.
This makes data storage one of the key issues for the industry, now and for years to come. Data that might currently be dismissed as mundane could prove important in the future, when compared with data collected by later missions, so all of it needs to be stored in a manner that is efficient and easily accessible.
One of the primary objectives of the data analysis is to detect anomalies that would be difficult to spot without gathering, storing and crunching through highly granular data. Storage speed is also important, particularly given the amount of data being created. Retrieval and analysis of the data are needed to debug and tune the feed, although this might not be done as frequently.
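As a sketch of the kind of anomaly detection that granular, stored data makes possible (the channel name, data and threshold below are invented, and a simple z-score test stands in for whatever models are actually used), the example below scans stored per-channel metrics and flags values that sit far from that channel's historical mean.

```python
# Hypothetical sketch: flagging stored metric values that deviate strongly
# from their channel's mean (a simple z-score test). Real systems would use
# richer models; this only illustrates crunching through granular history.
import random
import statistics
from collections import defaultdict

def find_anomalies(rows, z_limit=6.0):
    """rows: iterable of (channel, timestamp, value). Returns flagged rows."""
    by_channel = defaultdict(list)
    for channel, ts, value in rows:
        by_channel[channel].append((ts, value))

    flagged = []
    for channel, samples in by_channel.items():
        values = [v for _, v in samples]
        mean = statistics.fmean(values)
        stdev = statistics.pstdev(values) or 1e-9   # guard against zero spread
        for ts, value in samples:
            if abs(value - mean) / stdev > z_limit:
                flagged.append((channel, ts, value))
    return flagged

if __name__ == "__main__":
    random.seed(7)
    # Simulated granular history: mostly well-behaved values plus one spike.
    history = [("bus_voltage", t, random.gauss(28.0, 0.2)) for t in range(100_000)]
    history[42_000] = ("bus_voltage", 42_000, 35.0)   # injected anomaly
    for channel, ts, value in find_anomalies(history):
        print(f"anomaly on {channel} at t={ts}: {value:.2f}")
```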
What are the Risks?
The importance of this data, and of its analysis, cannot be overstated. Satellite data will become more and more vital over the next few years for transportation, communications and a myriad of other tools and services on which our world relies. As our reliance on satellite services increases, the risk of disruption needs to be minimized.
While massive improvements in technology and knowledge have greatly reduced the likely impact of a repeat of the 1859 Carrington Event, there is still a great deal that is not understood. The hostility of the environment in which satellites operate, coupled with the potential for unexpected events, means that being able to access and analyze both real-time and historical data is likely to be extremely important, especially in the event of an unexpected occurrence.
The associated financial risks are also significant and companies that can show they have taken the unexpected into account as far as possible are likely to be the ones that will attract private- and public-sector support.
What Role do Third-Party Technology Providers Play?
There’s been a fascinating evolution in the information technology sector. A decade ago, large firms focused their resources on developing technology services in-house, creating what were, at the time, powerful IT tools aimed at specific activities, whether in finance, medicine and health, energy and power, or aerospace.
There were several problems with these bespoke systems, the most obvious being a lack of flexibility. Because the systems were developed to complete highly specialized tasks, they were difficult to amend and rarely had any application beyond the specific task for which they were created. This also meant that firms were reliant on retaining their human talent to ensure that their systems could continue working.
This has changed over the last few years as, for example, the financial services industry has looked for ways to deploy its resources more efficiently, and the medical and health services industries have come to recognize the benefits of data analysis for medical applications.
This shift has created a generation of software providers who are far less tied to a specific industry, which means innovations developed for one sector have application across numerous verticals and, in turn, that many industries can benefit from advances in a particular technology.
For the satellite industry, data analysis tools that were previously exclusive to, or developed for, other markets are now available and applicable. This could have several benefits when it comes to getting the best out of the plethora of data being generated by every project, every launch and every satellite in service.
Data has little value until analyzed to yield information and insights. The potential for cross-over between industries and the way data is analyzed is likely to be one of the most interesting developments over the next few years and could well create some exceptionally useful advances as well as efficiencies of note for many market segments.
www.mcobject.com/
Steve Graves co-founded McObject in 2001 to provide real-time embedded database technology, which makes embedded systems smarter, more reliable and more cost-effective to develop and maintain.
McObject offers real-time data management technology used across a wide range of industries and market segments, including finance, IoT and aerospace.