Astrium GEO UK Multi-Mission PDGS Facilities and Services

• Introduction
   – UK-MM-PAC ERS-1/2
   – UK-MM-PAC ENVISAT
   – Data Hosting Activities
   – Future Activities
   – Infoterra Archives
   – LTDP Preparation
   – Collaborative Activities
• UK-MM-PAC ERS-1/2
   – Since 1991 for ERS-1, Astrium GEO (then NRSC Ltd)
     has provided Processing and Archiving Facility
     services for ESA. In 1994 the facility was expanded
     to support ERS-2.
      • Archiving of both HR (SAR) and LR (ATSR-1/2, ALT,
        GOME, WAVE, SCATT) data
      • On-Request SAR Processing
      • Systematic ALT Processing
   – Media types
      • Exabyte
      • Sony DIR1000
      • DLT
      • STK 9940B
      • STK T10KB
• UK-MM-PAC ENVISAT
   – In 2002, the UK-PAF became the UK Multi-Mission
     Processing and Archiving Centre (UK-MM-PAC) with the
     addition of ENVISAT operations
      •   Archiving of both HR (ASAR) and LR (AATSR, MERIS)
      •   On-Request ASAR Processing
      •   Systematic AATSR Processing
      •   On-Request MERIS Processing
      •   Systematic MERIS (MER_FRS) Processing
   – Current archive volume ~1.2 PB (incl. ERS)
   – Media types
      • IBM NTP
      • STK9940B
      • STK T10KB
• UK-MM-PAC Data Hosting
   – The following datasets are made available to Users
     for FTP download
      •   ATSR-1 (L1B, L2, L2P) ~10 TB
      •   ATSR-2 (L1B, L2, L2P) ~15 TB
      •   AATSR (L1B, L2, L2P) ~22 TB
      •   ERS ALT.WAP ~1 TB
      •   ERS Wind Scatterometer ~1.5 TB
      •   MERIS Reduced Resolution ~45 TB
   – Users are granted access through the eoa-up and
     ats-merci-uk servers
   – Access via 300 Mbps link
   – Additionally, a backup copy of the GEOHAZARD
     dataset from the Cloud is maintained
• UK-MM-PAC Data Hosting Continued
   – In addition to FTP access to data, HTTP access to
     (A)ATSR and MERIS RR datasets is provided through
     two MERCI servers
      • atsr-merci-uk.eo.esa.int
      • mer-merci-uk.eo.esa.int
• FTP downloads during Q1 2012 ~75 TB
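Purely as an illustration (assuming Q1 ≈ 90 days, an assumption not stated in the source), the quoted quarterly download volume can be converted to an average rate and compared against the 300 Mbps link:

```python
# Back-of-envelope check: does ~75 TB of FTP downloads in a quarter
# fit comfortably within a 300 Mbps link? (Q1 ≈ 90 days is assumed.)
TB = 10**12  # decimal terabyte, in bytes

quarter_seconds = 90 * 24 * 3600
avg_bytes_per_sec = 75 * TB / quarter_seconds
avg_mbps = avg_bytes_per_sec * 8 / 10**6  # megabits per second

print(f"average download rate: {avg_mbps:.0f} Mbps of the 300 Mbps link")
```

On these assumptions the sustained average is roughly a quarter of the link capacity, leaving headroom for bursty demand.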

                      HMA Standards Summary Table
• Future Activities
   – SWARM APDF
      • Following the launch in Q4 2012, Astrium GEO will
        provide operational support for the SWARM Mission
        using MMFI infrastructure
   – Sentinel-1 Offline PAC
      • Astrium GEO will shortly commence site preparation
        activities for the Sentinel-1 Mission including the
        provision of an LTA facility
      • Support will also be provided for the commissioning of
        the PDGS following Launch
   – Sentinel-2 Offline PAC
      • Initial site preparation activities for the Sentinel-2
        Mission will also be undertaken
• The Sentinel LTAs will be interfaced to the PAC
  infrastructure via a set of web services.
• The LTAs have generic requirements: 30 TB/day I/O, a
  minimum transfer rate of 220 MB/s, 10,000 files per day
  and a maximum file size of 200 GB.
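These figures imply some useful derived numbers. The following illustrative calculation (not from the source) works them out; note that 30 TB/day spread over 24 hours exceeds 220 MB/s, so the minimum transfer rate presumably applies per transfer channel rather than to the aggregate:

```python
# Derived figures from the generic LTA requirements:
# 30 TB/day I/O, min 220 MB/s transfer, 10,000 files/day, 200 GB max file.
TB, GB, MB = 10**12, 10**9, 10**6

daily_io_bytes = 30 * TB
sustained_mb_s = daily_io_bytes / (24 * 3600) / MB       # rate if spread over 24 h
avg_file_gb = daily_io_bytes / 10_000 / GB               # mean file size at 10k files/day
hours_at_min_rate = daily_io_bytes / (220 * MB) / 3600   # time to move 30 TB at 220 MB/s

print(f"30 TB/day ≈ {sustained_mb_s:.0f} MB/s sustained")
print(f"average file size ≈ {avg_file_gb:.0f} GB")
print(f"30 TB at 220 MB/s ≈ {hours_at_min_rate:.1f} h")
```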
• The Infoterra LTA for Sentinel-1 consists of
   – Web services to realise the interface
   – Fast Network Infrastructure
   – Disk Cache to support fast I/O
   – Hierarchical Global File System – to automate the archival and
     retrieval of products
   – Tape Archive
• Tape Archive to support Long Term Storage
   –   Tape archive based on i6000 LTO robotics
   –   Support for LTO-5 and LTO-6 products
   –   Scalable (96 drives, 5,300 slots)
   –   LTO technology delivers an open format and a long roadmap
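The hierarchical file system's job of automating archival and retrieval can be sketched in miniature. This is an illustrative toy model of policy-driven tiering (not the actual StorNext or MMFI software; all product names and thresholds are invented): products land in the disk cache, and an age-based policy migrates the oldest eligible ones to tape until the cache fits its budget.

```python
# Toy sketch of policy-initiated movement between disk cache and tape,
# the idea behind a hierarchical storage manager. Not real StorNext code.
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    size_gb: float
    age_days: int
    tier: str = "disk"  # "disk" (cache) or "tape" (long-term storage)

def migrate_to_tape(cache: list[Product], max_cache_gb: float,
                    min_age_days: int) -> list[str]:
    """Move the oldest eligible products to tape until the cache fits."""
    moved = []
    used = sum(p.size_gb for p in cache if p.tier == "disk")
    # Oldest first, mirroring a typical age-based HSM policy.
    for p in sorted(cache, key=lambda p: p.age_days, reverse=True):
        if used <= max_cache_gb:
            break
        if p.tier == "disk" and p.age_days >= min_age_days:
            p.tier = "tape"
            used -= p.size_gb
            moved.append(p.name)
    return moved

cache = [
    Product("S1_SLC_0001", 4.0, age_days=400),
    Product("S1_SLC_0002", 4.0, age_days=30),
    Product("S1_GRD_0003", 1.0, age_days=200),
]
moved = migrate_to_tape(cache, max_cache_gb=5.0, min_age_days=90)
print(moved)  # the oldest product is migrated so the cache fits in 5 GB
```

In a real system the policy would also consider access frequency and would stage products back to disk cache on retrieval requests.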
• Sentinel LTA

  [Architecture diagram: a StorNext shared file system. High-performance
  SAN clients (Linux, Windows, UNIX or Mac) and StorNext distributed LAN
  clients (up to 4x NAS performance, with resilience against component
  and server failure) connect over Fibre Channel and LAN fabric switches
  to the shared file system, which combines high-performance FC disk with
  a SATA disk array. StorNext management servers (Linux or Solaris) hold
  the file-system metadata and configuration; StorNext Storage Manager
  performs policy-initiated movement of content between disk tiers and
  the tape library. Application workstations connect via CIFS or NFS.]
• Infoterra Archive
   – Based around a heterogeneous global file system
     supporting enterprise Hitachi storage, Quantum
     robotics (i6k) and LTO-5 tape storage, allowing for:
   – 150 TB online storage
   – 0.7 PB near-line storage
   – Aerial photography (25 cm, 12.5 cm and 10 cm), CIR,
     DEMs, DTMs, flood data, OS vector datasets, LiDAR
     datasets, etc.
   – Served to customers/users via the public Internet,
     OGC web services, media or FTP
   – Also storage and distribution of national vector
     datasets, as well as asset management and insurance
     applications
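As an illustration of the OGC web-service delivery route, a standard WMS GetMap request to such an archive might look like the following. The host, layer name and bounding box here are all hypothetical; only the parameter names come from the OGC WMS 1.3.0 standard:

```python
# Hypothetical OGC WMS 1.3.0 GetMap request; host, layer and bbox invented.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "aerial_25cm",              # hypothetical layer name
    "CRS": "EPSG:27700",                  # British National Grid, plausible for OS data
    "BBOX": "400000,300000,410000,310000",
    "WIDTH": "1024",
    "HEIGHT": "1024",
    "FORMAT": "image/png",
}
url = "https://archive.example.com/wms?" + urlencode(params)
print(url)
```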
• LTDP Preparations

   – The following activities are currently being undertaken

      • Reviewing Archive content against catalogue
      • Preparation for reading of all Archived media
      • Verification of previously transcribed datasets against
        transcription records
      • Identification of documentation
      • Scanning of paper documentation
      • Virtualisation of Altimeter processing implemented
        to generate ERS-1/2 ALT-WAP products
      • Further virtualisation being assessed
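One common way (an assumption on my part, not the documented procedure) to verify transcribed datasets against transcription records is to recompute each file's checksum and compare it with the value logged at transcription time:

```python
# Illustrative checksum-based verification of transcribed datasets.
# The record format (path -> hex digest) is an assumption for this sketch.
import hashlib

def file_md5(path: str, chunk_size: int = 1 << 20) -> str:
    """MD5 of a file, read in chunks so large products don't fill memory."""
    digest = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(records: dict[str, str]) -> list[str]:
    """Return the paths whose current checksum no longer matches the record."""
    return [path for path, expected in records.items()
            if file_md5(path) != expected]
```

Any paths returned by `verify` would then be flagged for re-transcription from the original media.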

     • Collaborative Activities
        – Working with ISIC – CEMS Project
            • The provision of computing infrastructure
              for access to large-volume EO and climate
              model datasets, co-located with high
              performance computing facilities
            • Collaboration of commercial and academic
              resources
            • Data integrity service and metadata
              infrastructure – FP7 Charm project
            • Full vCloud virtualisation for serving the
              commercial and academic user communities
            • Users can 'bring their own data' to CEMS
              for inclusion in their environment and to
              develop 'mash-ups' using EO datasets