4.8: Extending ArcGIS Enterprise
✂️ Tl;dr 🥷
The Platinum Stage outlines a structured approach to extending the eMap platform with specialised ArcGIS Enterprise server roles (e.g., Image Server, GeoEvent Server) only when justified by specific business needs. Each potential addition, such as real-time data processing with GeoEvent Server or advanced imagery analysis with Image Server, must undergo rigorous evaluation against existing Azure PaaS alternatives (e.g., Azure IoT Hub, Databricks) to ensure cost-effectiveness and alignment with cloud-native principles. Approved roles follow a structured implementation process: validating requirements, pilot testing in non-production environments, developing infrastructure-as-code automation, integrating high availability/disaster recovery frameworks, and comprehensive user training. This incremental, requirements-driven strategy prioritises architectural consistency, avoids unnecessary complexity, and ensures new components deliver tangible value while adhering to the platform’s security, automation, and governance standards.
This section outlines the strategy for extending the new eMap platform with specialised ArcGIS Enterprise server roles, should specific business requirements arise. Any such extension must follow a methodical approach to integration with the established architecture, ensuring consistency with automation, High Availability (HA) and Disaster Recovery (DR) frameworks.
4.8.1 Objectives: Incrementally Add Specialised Server Roles Based on Business Need
The primary objective of the Platinum Stage is to provide a structured pathway for extending the base ArcGIS Enterprise deployment with specialised server roles based on the following principles:
- Business-Driven Extension: The introduction of any specialised server role must be justified by a clear, validated business need and deliver demonstrable value. It is not an automatic progression but an optional extension based on specific requirements.
- Incremental Implementation: Specialised roles should be added incrementally, one at a time, rather than attempting a simultaneous deployment of multiple advanced capabilities. This allows for focused effort, risk mitigation and iterative learning.
- Architectural Consistency: New components must be integrated into the existing eMap architecture, adhering to the established principles of Azure PaaS utilisation, Infrastructure as Code (OpenTofu), Configuration Management and Zero Trust security.
- HA/DR Integration: Any specialised server role deployed in the Production environment must be incorporated into the High Availability (HA) and Disaster Recovery (DR) frameworks established in the Silver and Gold stages.
- Automation Alignment: The deployment, configuration and management of specialised roles must be automated using OpenTofu and the designated Configuration Management tool, consistent with the automation strategy for the base platform.
- Governance Adherence: The addition of new server roles must comply with the overall data governance and lifecycle management policies defined for the new eMap platform.
4.8.2 ArcGIS Enterprise Specialised Server Roles
ArcGIS Enterprise offers a suite of specialised server roles that extend its capabilities beyond the general GIS service hosting provided by the base deployment. The decision to implement any of these roles on the new eMap platform will be based on specific, validated organisational requirements emerging after the base platform is established. Each specialised server role typically involves licensing an additional ArcGIS Server site specifically for that role and federating it with the ArcGIS Enterprise portal.
ArcGIS GIS Server (Additional General Purpose Sites)
- What it Does & Provides:
- The base ArcGIS Enterprise deployment includes an ArcGIS GIS Server site acting as the crucial hosting server. This site supports the core Web GIS infrastructure and also functions as a general-purpose GIS Server for publishing services that reference data sources.
- Historically, organisations have considered deploying additional, separate ArcGIS GIS Server sites federated with the Portal. The stated rationale for such an approach was often to dedicate these sites to specific functions or workloads, aiming to isolate these from the hosting server.
- The premise was that this separation could enhance performance or manageability for specific tasks.
- The New eMap Platform: A Unified Site Approach
- For the new eMap platform, the creation of additional, separate ArcGIS GIS Server sites for general purpose mapping, visualisation, or geoprocessing tasks is strongly discouraged. This architectural pattern is considered an archaic approach to scalability and management, introducing unnecessary complexity and overhead that the modern design of the eMap platform seeks to avoid.
- The eMap platform will operate with a single, unified ArcGIS Server site. This site, already configured as the hosting server, will also serve as the general-purpose GIS server for all standard mapping, feature and geoprocessing services.
- Scalability via VMSS: The primary mechanism for scaling this unified ArcGIS Server site to meet fluctuating demands for its services (hosting, mapping, geoprocessing) is the Azure Virtual Machine Scale Set (VMSS). The VMSS provides robust, dynamic and cost-effective horizontal scaling for the entire site, ensuring resources are available based on configured auto-scaling rules (e.g., CPU, memory metrics). This negates the need for separate sites merely to handle increased load.
- Risk Mitigation through Modern Practices: Instead of relying on architectural segregation via multiple general-purpose sites to mitigate risks (e.g., a poorly performing service impacting others), the new eMap platform will leverage:
- Rigorous Environment Isolation: Thorough testing and validation of all services in isolated DEV and UAT environments before promotion to PROD.
- Comprehensive Automated Testing: Integrated into CI/CD pipelines to catch issues early, including performance and stability tests for services.
- Robust CI/CD Pipelines: Enforcing quality gates, automated deployments and configuration consistency, significantly reducing the likelihood of problematic services reaching production or causing widespread impact.
- Advanced Monitoring and Alerting: Proactive identification and resolution of issues within the unified site.
- Avoiding Complexity and Configuration Drift: Maintaining multiple general-purpose GIS Server sites introduces significant operational burdens:
- Increased configuration management overhead (multiple config-stores, directories, federation settings).
- Higher risk of configuration drift between sites, leading to inconsistent behaviour and troubleshooting difficulties.
- More complex upgrade and patching procedures.
- By adhering to a single, unified, VMSS-scalable ArcGIS Server site for general GIS capabilities, the new eMap platform will benefit from a more streamlined, manageable, automated and cost-effective architecture, aligning with modern cloud-native principles and avoiding the pitfalls of outdated deployment patterns.
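The VMSS auto-scaling behaviour described above can be illustrated with a toy model. This is not Azure's autoscale API and the thresholds and capacity bounds are illustrative assumptions, not the platform's actual configuration; it simply shows how a CPU-based scale-out/scale-in rule with a minimum HA floor behaves.

```python
# Toy model of CPU-based VMSS auto-scaling rules. Thresholds and
# capacity bounds are illustrative assumptions, not the eMap platform's
# actual VMSS configuration or the Azure autoscale API.

def evaluate_autoscale(current_instances: int, avg_cpu_percent: float,
                       scale_out_at: float = 75.0, scale_in_at: float = 25.0,
                       min_instances: int = 2, max_instances: int = 10) -> int:
    """Return the desired instance count after applying the scaling rules."""
    if avg_cpu_percent > scale_out_at:
        return min(current_instances + 1, max_instances)
    if avg_cpu_percent < scale_in_at:
        return max(current_instances - 1, min_instances)
    return current_instances

# Sustained high CPU scales the unified site out; idle periods scale it
# back in, but never below the two-instance HA floor.
print(evaluate_autoscale(3, 82.0))  # scale out -> 4
print(evaluate_autoscale(3, 10.0))  # scale in  -> 2
print(evaluate_autoscale(2, 10.0))  # at floor  -> 2
```

The key design point is that one rule set governs the whole site, in contrast to the per-site capacity planning that multiple general-purpose sites would require.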
ArcGIS Image Server
- What it Does & Provides:
- ArcGIS Image Server extends the platform's capabilities for managing, processing, serving and analysing large collections of imagery and raster data. It provides a distributed computing and storage system for these tasks.
- Key capabilities include:
- Dynamic Image Services: Allows publishing image services (typically from mosaic datasets) that perform on-the-fly processing. This includes operations such as orthorectification, mosaicking, band combinations and applying raster functions as imagery is requested by clients, avoiding the need to pre-process and store multiple static products.
- Raster Analytics: Enables distributed processing for computationally intensive raster analysis tasks on large datasets (e.g., classification, suitability modelling, hydrologic analysis), which can speed up the generation of derived information products. Raster analytics is often deployed as a separate Image Server site from dynamic image services to prevent resource contention.
- Ortho Mapping: Provides server-side photogrammetric processing of satellite, aerial, or drone imagery to generate digital elevation models (DEMs) and orthomosaics, available via ArcGIS Pro or the ArcGIS Ortho Maker web application.
- Imagery Hosting: Allows users within the organisation to upload imagery to ArcGIS Enterprise, which is then served as dynamic imagery layers accessible for various applications and for use in raster analytics.
- Supports Deep Learning Studio, a web application for managing deep learning workflows when Image Server is configured for raster analytics. This requires GPU-enabled VMs for optimal performance.
- Image services can reference imagery stored on-premises or in cloud storage (e.g., Azure Blob Storage, Azure Data Lake Storage Gen2, as planned in Section 4.4.1).
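The "on-the-fly raster function" concept above can be made concrete with a minimal band-math example: an NDVI computed per pixel at request time rather than stored as a static derived product. This is a plain-Python sketch with no Esri dependency; real image services apply such functions to mosaic datasets, not nested lists.

```python
# Minimal illustration of an on-the-fly raster function: NDVI band math
# applied per pixel at request time instead of being pre-computed and
# stored as a static product. Pure Python for illustration only; real
# image services operate on mosaic datasets, not nested lists.

def ndvi(nir_band, red_band):
    """Compute NDVI = (NIR - Red) / (NIR + Red) for two equally sized rasters."""
    out = []
    for nir_row, red_row in zip(nir_band, red_band):
        row = []
        for nir, red in zip(nir_row, red_row):
            denom = nir + red
            row.append((nir - red) / denom if denom else 0.0)
        out.append(row)
    return out

nir = [[0.8, 0.6], [0.5, 0.9]]
red = [[0.1, 0.2], [0.4, 0.1]]
result = ndvi(nir, red)
print(result[0][0])  # ~0.778 for the first pixel
```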
- Usefulness for the New eMap Platform:
- The new eMap platform is designed to utilise Azure Data Lake Storage Gen2 (ADLS Gen2) as its primary Raster Store, leveraging cloud-optimised formats such as Cloud Raster Format (CRF) and Meta Raster Format (MRF). ArcGIS Image Server could offer certain server-side capabilities to complement this.
- Its potential contributions include:
- Dynamic Image Services: ArcGIS Image Server's primary strength lies in its ability to serve dynamic, processed imagery (e.g., applying on-the-fly mosaicking, band combinations, raster functions) directly from sources such as the ADLS Gen2 Raster Store. This can reduce the need to pre-process and store numerous static imagery products and can also provide OGC-compliant endpoints (WMS, WCS, WMTS).
- GIS-Centric Raster Analytics: It offers tools for distributed processing of large raster datasets for tasks such as land cover classification or environmental modelling, integrated within the Esri ecosystem.
- Integrated Ortho Mapping: Through Ortho Maker, it provides a streamlined, server-side workflow for generating orthoimages and DEMs from raw sensor data.
- Simplified Imagery Hosting & Deep Learning Interface: It allows users to upload imagery and provides a GIS-centric interface (Deep Learning Studio) for deep learning tasks (requires GPU-enabled VMs).
- However, evaluation against existing Azure PaaS capabilities is essential before committing to ArcGIS Image Server. IRD has access to powerful Azure services such as Databricks, Azure Machine Learning, Azure Batch and Azure Functions, which can address many imagery processing and analytics needs:
- Large-Scale Raster Analytics & Batch Processing: For computationally intensive raster analysis (e.g., classification, large-area change detection, suitability modelling), Azure Databricks (using Spark with libraries such as Apache Sedona/GeoSpark or custom GDAL/Rasterio jobs) often provides superior scalability, performance and flexibility when operating on data in ADLS Gen2. Azure Batch offers another avenue for parallel custom processing.
- Deep Learning and MLOps: For robust, end-to-end deep learning workflows (training, deployment, management), Azure Machine Learning offers a more comprehensive MLOps platform with better integration into the broader Azure AI ecosystem and more advanced GPU management capabilities.
- Custom Image Serving & Processing: While Azure lacks a direct PaaS equivalent to Image Server's full dynamic serving capabilities, simpler serving needs (e.g., serving Cloud Optimized GeoTIFFs) can be achieved using Azure Functions or by deploying open-source tile servers (e.g., TiTiler on App Service). This requires custom development but can be a scalable and cost-effective solution for specific use cases.
- Ortho Mapping: Creating a full ortho mapping pipeline would involve deploying and managing specialised photogrammetry software. ArcGIS Ortho Maker is a compelling solution if Ortho Mapping is required.
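The custom tile-serving option above ultimately rests on standard XYZ ("slippy map") tile addressing: a tile endpoint maps a z/x/y index to a geographic extent and renders the corresponding window of the COG. The conversion itself is well-known Web Mercator tiling maths, not a TiTiler-specific API; a minimal sketch:

```python
import math

# Standard "slippy map" XYZ tile addressing used by tile servers such as
# TiTiler: convert a z/x/y tile index to its WGS84 bounding box. This is
# the generic Web Mercator tiling scheme, not a TiTiler-specific API.

def tile_bounds(z: int, x: int, y: int):
    """Return (west, south, east, north) in degrees for an XYZ tile."""
    n = 2 ** z

    def lon(xt):  # tile column -> longitude
        return xt / n * 360.0 - 180.0

    def lat(yt):  # tile row -> latitude (inverse Web Mercator)
        return math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * yt / n))))

    return (lon(x), lat(y + 1), lon(x + 1), lat(y))

# Tile 0/0/0 covers the whole Web Mercator world.
print(tile_bounds(0, 0, 0))  # (-180.0, ~-85.05, 180.0, ~85.05)
```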
- Decision Factors for the new eMap Platform:
- Requirement for Dynamic, On-the-Fly Services: If there's a strong, validated business need for serving imagery with complex, real-time server-side processing, ArcGIS Image Server presents a compelling, out-of-the-box solution.
- Cost and Operational Overhead: Compare the licensing, VM infrastructure (potentially GPU) and management overhead of Image Server against the consumption-based costs and managed nature of Azure PaaS for specific tasks.
- Integration Complexity: Evaluate the effort to integrate custom PaaS solutions versus using Image Server's built-in capabilities.
- The deployment of ArcGIS Image Server should be considered if its unique dynamic serving capabilities or tightly integrated, GIS-centric workflows for ortho mapping or specific types of raster analysis offer a clear advantage that cannot be efficiently or cost-effectively replicated by extending the existing Azure PaaS analytics and compute services. For many large-scale analytics and deep learning tasks, a direct Azure PaaS approach may be more aligned with cloud-native and scalable design principles.
ArcGIS GeoEvent Server
- What it Does & Provides:
- ArcGIS GeoEvent Server enables real-time ingestion, processing and dissemination of event-based data streams.
- It connects to a wide variety of streaming data sources, such as IoT devices, sensors, GPS feeds from vehicles or mobile devices and social media.
- Allows for real-time filtering, processing and spatial analysis of these events (e.g., geofencing, proximity alerts, attribute calculations).
- Can trigger alerts, send notifications, or update GIS data (e.g., feature services) dynamically based on defined conditions.
- For archiving high-velocity and large-volume event data (e.g., >200 events/second), it can integrate with an ArcGIS Data Store configured as a spatiotemporal big data store. For lower event volumes, the relational data store is sufficient.
- GeoEvent Servers are typically configured without an ArcGIS Web Adaptor, as clients (e.g., web map users) interact with the outputs (e.g., stream services, updated feature services) rather than with GeoEvent Server itself.
- The number and configuration of GeoEvent Server instances depends on the event velocity, data size per event and complexity of real-time processing. A single instance can process from 2,000 to 6,000 events per second depending on these factors.
- The recommended deployment often involves dedicating separate VMs for GeoEvent Server to avoid resource contention with other ArcGIS Enterprise components.
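The throughput figures above imply a simple sizing check: divide the expected peak event rate by per-instance capacity. A back-of-envelope sketch, using the 2,000–6,000 events/second range quoted above (planning on the conservative end of that range is our assumption, not Esri guidance):

```python
import math

# Back-of-envelope GeoEvent Server sizing using the throughput range
# quoted above (roughly 2,000-6,000 events/second per instance,
# depending on processing complexity). Defaulting to the conservative
# end of the range is an assumption, not Esri guidance.

def instances_needed(peak_events_per_second: int,
                     per_instance_capacity: int = 2000) -> int:
    """Minimum instance count to absorb the peak event rate."""
    return max(1, math.ceil(peak_events_per_second / per_instance_capacity))

print(instances_needed(5500))        # -> 3 at the conservative 2,000 eps
print(instances_needed(5500, 6000))  # -> 1 for simple, lightweight processing
```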
- Usefulness for the New eMap Platform:
- For specific scenarios involving real-time spatial data feeds (e.g., vehicle tracking for emergency management, dynamic updates to asset locations, environmental sensor networks requiring immediate spatial context for alerts), GeoEvent Server could potentially offer a GIS-centric solution.
- If deployed, its value would be in enabling the new eMap platform to:
- Display rapidly changing information directly on maps and dashboards for enhanced, real-time situational awareness.
- Perform immediate, spatially-aware processing (e.g., geofencing, proximity analysis) on incoming event streams.
- Automate notifications and trigger responses based on real-time spatial conditions, potentially improving operational responsiveness.
- However, IRD has existing investments in, and access to, powerful, highly scalable and deeply integrated Azure PaaS services for IoT and real-time data processing, most notably Azure IoT Hub, Azure Stream Analytics, Azure Functions and Azure Event Hubs. These services provide robust capabilities for device connectivity, data ingestion at scale, complex event processing, stream analytics and triggering downstream actions, often with advantages in terms of native Azure integration, cost-effectiveness (pay-as-you-go) and reduced infrastructure management overhead compared to a VM-based solution such as GeoEvent Server.
- Therefore, before any decision to implement ArcGIS GeoEvent Server, a rigorous comparative evaluation against leveraging the existing Azure PaaS IoT/streaming services MUST be conducted. This evaluation should assess:
- Nature of Spatial Processing: Are the real-time spatial processing requirements highly specialised and best addressed by GeoEvent Server's out-of-the-box GIS tools, or can they be effectively implemented using Azure Stream Analytics (with its geospatial functions), the existing Azure Database for PostgreSQL with PostGIS enabled, or custom logic in Azure Functions, potentially leveraging the ArcGIS API for Python for specific GIS interactions?
- Data Volume, Velocity and Variety: Assess the expected scale of real-time data. Azure PaaS services are designed for hyper-scale, while GeoEvent Server's capacity is tied to its underlying VM resources and licensing.
- Integration Requirements: How critical is direct, low-latency integration with ArcGIS feature services versus broader integration with other Azure services (data lakes, Power BI, machine learning services)?
- Total Cost of Ownership (TCO): Compare the licensing, Azure VM infrastructure and operational management costs of GeoEvent Server against the consumption-based pricing and managed-service benefits of Azure PaaS.
- Skillset and Operational Burden: Evaluate the team's expertise in managing and developing for GeoEvent Server versus Azure PaaS streaming/IoT services.
- Scalability and Elasticity: Azure PaaS services offer superior elasticity and automated scaling compared to the more manual scaling or VMSS-based scaling of a GeoEvent Server deployment.
- ArcGIS GeoEvent Server should only be considered if this evaluation clearly demonstrates a compelling, unique advantage for specific, critical real-time geospatial use cases that cannot be met as effectively, efficiently, or economically by the existing Azure PaaS capabilities. In many common real-time scenarios, augmenting Azure IoT Hub pipelines with Azure Stream Analytics or Azure Functions for spatial logic may prove to be a more modern, scalable and cost-effective approach, aligning better with cloud-native principles.
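The "spatial logic in Azure Functions" option mentioned above can be as simple as a geofence test on each incoming event. A minimal ray-casting point-in-polygon sketch with no external dependencies (production code would more likely use shapely, Azure Stream Analytics geospatial functions, or PostGIS):

```python
# Minimal geofence check of the kind that could run inside an Azure
# Function on an event stream: a standard ray-casting point-in-polygon
# test. Illustrative only; production code would more likely use
# shapely, Stream Analytics geospatial functions, or PostGIS.

def point_in_polygon(lon: float, lat: float, polygon) -> bool:
    """Ray-casting test; polygon is a list of (lon, lat) vertices."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        crosses = (yi > lat) != (yj > lat)  # edge straddles the point's latitude
        if crosses and lon < (xj - xi) * (lat - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Hypothetical rectangular geofence (lon 150-151, lat -34..-33).
fence = [(150.0, -34.0), (151.0, -34.0), (151.0, -33.0), (150.0, -33.0)]
print(point_in_polygon(150.5, -33.5, fence))  # True: event inside the fence
print(point_in_polygon(152.0, -33.5, fence))  # False: event outside
```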
ArcGIS Notebook Server
- What it Does & Provides:
- ArcGIS Notebook Server integrates a Python-based data science platform in the ArcGIS Enterprise portal.
- It hosts and runs ArcGIS Notebooks, which provide an interactive, web-based environment for coding.
- Users can perform spatial analysis, craft data science and machine learning (ML) workflows, manage GIS data and content and automate administrative tasks using Python.
- It leverages Esri's Python libraries: ArcGIS API for Python (for interacting with the Web GIS platform) and ArcPy (for desktop-level geoprocessing and data management tasks).
- Architecturally, ArcGIS Notebook Server uses containers (Docker and Mirantis Container Runtime are both supported) to provide isolated, dedicated environments for each notebook author. This ensures that resource usage (CPU, memory) by one user does not impact others. Docker (or Mirantis) must be installed and configured on the Notebook Server VMs.
- Each ArcGIS Notebook Server site is federated with the portal and typically configured with an ArcGIS Web Adaptor for access.
- Supports horizontal scaling, allowing distribution of notebook execution across several VMs.
- Provides different runtimes (pre-configured Python environments within the containers):
- Standard Runtime: Includes ArcGIS API for Python and common third-party Python libraries.
- Advanced Runtime: Includes everything in Standard, plus ArcPy and related libraries. Access to the Advanced runtime requires specific user privileges in the portal and should be pre-defined and restricted.
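Because the Advanced runtime differs from Standard chiefly by the presence of ArcPy, a notebook can detect which environment it is executing in before attempting ArcPy-dependent work. The probe below is a generic Python technique, not an Esri-provided API, and the fallback message is illustrative:

```python
import importlib.util

# Sketch: infer which Notebook Server runtime a script is executing in
# by probing for ArcPy, which is present only in the Advanced runtime.
# The probe is a generic Python technique, not an Esri-provided API.

def detect_runtime() -> str:
    """Return 'Advanced' if ArcPy is importable, else 'Standard'."""
    return "Advanced" if importlib.util.find_spec("arcpy") else "Standard"

runtime = detect_runtime()
print(f"Running in the {runtime} runtime")
if runtime == "Standard":
    # Geoprocessing via ArcPy is unavailable; restrict the notebook to
    # ArcGIS API for Python workflows against the portal.
    print("ArcPy unavailable; falling back to ArcGIS API for Python")
```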
- Usefulness for the New eMap Platform:
- ArcGIS Notebook Server has the potential to provide a powerful and scalable environment for GIS analysts, data scientists, developers and administrators within the organisation. Its key strength lies in its native integration with ArcGIS Enterprise, offering access to portal items, services, the security model and crucially, ArcPy for geoprocessing and data management tasks.
- If adopted, it could enable advanced spatial data analysis, the development of custom data science and ML models leveraging the organisation's geospatial data, facilitate automation of GIS-centric workflows and support exploratory data analysis.
- However, it is crucial to recognise that IRD also has access to robust, enterprise-grade notebook and data science platforms within its existing Azure and Databricks environments. These platforms (e.g., Azure Machine Learning Notebooks, Azure Synapse Analytics Notebooks, Databricks Notebooks) offer comprehensive data science capabilities, extensive Python library support (including the ArcGIS API for Python for interacting with ArcGIS Enterprise services), strong integration with the broader Azure data ecosystem (Azure Data Lake Storage, Azure Blob Storage, etc.) and mature, highly scalable compute options. Nearly all these platforms are based on Apache Spark for large-scale data processing.
- Before considering the deployment of ArcGIS Notebook Server, a robust comparative evaluation MUST be undertaken. This evaluation should assess:
- Specific Use Cases and User Profiles: Identify the primary users and the nature of their work. Are they predominantly GIS professionals requiring deep ArcPy integration for established workflows, or are they data scientists/engineers who might be more productive in Spark-based or Databricks environments?
- Data Proximity and Processing Needs: Where does the data primarily reside? If massive datasets are already in ADLS Gen2 and require Spark-based processing, Databricks or Azure Synapse might be more efficient. If workflows are tightly coupled with geodatabase operations or complex ArcPy scripts, the ArcGIS Notebook Server might offer advantages.
- Integration with Existing Analytics Ecosystems: How will the notebook environment integrate with existing data pipelines, MLOps practices and other analytical tools in use within Azure or Databricks?
- Scalability Requirements: While ArcGIS Notebook Server offers containerised environments and can scale across multiple VMs, its scalability paradigm and resource management (including Docker/Mirantis administration) pale in comparison to the elastic scaling capabilities of Azure PaaS compute and Databricks clusters.
- Operational Overhead: Assess the effort required to manage the ArcGIS Notebook Server infrastructure (including patching, Docker updates, runtime management) versus the managed service nature of Azure/Databricks notebook offerings.
- Strategic Alignment: Which platform best aligns with the organisation's broader data, analytics and cloud strategy?
- Only after a comprehensive evaluation demonstrates a clear and unique advantage for ArcGIS Notebook Server for specific, critical use cases should its deployment be considered.
ArcGIS Workflow Manager Server
- What it Does & Provides:
- ArcGIS Workflow Manager Server is a system for automating and managing GIS and non-GIS workflows.
- It provides tools to define, execute, monitor and track multi-step business processes.
- Includes a web application for users and administrators to interact with jobs, define workflows and monitor progress.
- An ArcGIS Workflow Manager Server is federated with the portal and configured with an ArcGIS Web Adaptor. It can be installed on its own VM or co-located, depending on load.
- The ArcGIS Workflow Manager Server Advanced role (an optional add-on) extends functionality with features such as:
- Scheduling of individual steps or entire jobs (e.g., for recurring tasks).
- Webhook integrations (e.g., automatically creating jobs from ArcGIS Survey123 submissions or other external systems).
- Automated web request execution as part of a workflow.
- Extended run times for long-running geoprocessing or data quality steps.
- Key components include: Workflow Items (for organising different types of work), Step Templates (pre-configured actions), Workflow Diagrams (visual models of processes), Job Templates (pre-defined job properties) and Jobs (individual instances of work).
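The component hierarchy just listed can be sketched as a minimal data model: a workflow diagram is an ordered set of step templates, and a job is a running instance that advances through the diagram. These classes and names are hypothetical illustrations of the concepts, not the Workflow Manager API:

```python
from dataclasses import dataclass, field

# Illustrative model of the Workflow Manager concepts listed above:
# step templates compose a workflow diagram, and a job is a running
# instance that advances through the diagram's steps. The classes are
# hypothetical, not the Workflow Manager API.

@dataclass
class StepTemplate:
    name: str  # e.g. a pre-configured geoprocessing action

@dataclass
class WorkflowDiagram:
    name: str
    steps: list = field(default_factory=list)

@dataclass
class Job:
    name: str
    diagram: WorkflowDiagram
    current_step: int = 0

    def advance(self) -> str:
        """Complete the current step; return the next step's name or 'done'."""
        self.current_step += 1
        if self.current_step < len(self.diagram.steps):
            return self.diagram.steps[self.current_step].name
        return "done"

qa = WorkflowDiagram("Data QA/QC", [StepTemplate("Validate geometry"),
                                    StepTemplate("Review attributes"),
                                    StepTemplate("Publish service")])
job = Job("Monthly QA", qa)
print(job.advance())  # -> Review attributes
print(job.advance())  # -> Publish service
print(job.advance())  # -> done
```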
- Usefulness for the New eMap Platform:
- ArcGIS Workflow Manager Server's primary value proposition lies in its out-of-the-box integration with the Esri ecosystem, providing pre-built step templates for common GIS operations (e.g., geoprocessing service execution, map interaction) and a user interface familiar to GIS professionals.
- If adopted, it could assist in:
- Standardising complex GIS-specific operations (data QA/QC, map production).
- Automating sequences of ArcGIS-related tasks.
- However, IRD's Azure-native foundation and the availability of mature, powerful and often more flexible workflow and orchestration tools necessitate a rigorous comparison before committing to ArcGIS Workflow Manager Server. Many of its general workflow capabilities are often surpassed by existing Azure PaaS offerings or widely adopted open-source solutions:
- Potential Alternative Solutions:
- Azure Logic Apps:
- Serverless, consumption-based PaaS. Excels at system integration with a library of connectors (including generic HTTP for ArcGIS REST APIs). Visual designer is intuitive for integration flows. Can orchestrate Azure Functions for custom GIS logic (e.g., wrapping ArcGIS API for Python). Better for event-driven workflows and broader Azure ecosystem integration. Less inherently "GIS-aware" without custom components.
- For system integrations and automated sequences triggered by Azure events, Logic Apps is likely more aligned and cost-effective. ArcGIS WFM might seem easier for purely Esri-centric tasks but adds infrastructure overhead.
- Azure Durable Functions:
- Serverless, code-first, stateful workflow orchestration. Offers immense flexibility for complex logic, long-running processes, error handling and scalability. Ideal for developers. Lacks a built-in UI for workflow design or end-user task management (this would need to be custom-built or integrated with other services).
- For backend, complex, stateful orchestrations, especially those requiring custom code and fine-grained control, Durable Functions offer a powerful, scalable and potentially more cost-effective solution. The GIS-specific steps of WFM would need to be implemented as individual functions.
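Durable Functions orchestrators are written as generators that yield activity calls, with the runtime replaying the generator and supplying each activity's result. A pure-Python sketch of that pattern, without the Azure runtime; the orchestrator and activity names are invented for illustration:

```python
# Pure-Python sketch of the Durable Functions orchestrator pattern:
# the orchestrator is a generator that yields (activity, argument)
# pairs, and a minimal driver replays it, supplying each result. The
# azure-functions runtime does this for real; all names are invented.

def gis_export_orchestrator(job):
    extent = yield ("clip_to_extent", job)
    tiles = yield ("render_tiles", extent)
    url = yield ("upload_to_blob", tiles)
    return url

# Stand-ins for activity functions (each would be its own Azure Function).
ACTIVITIES = {
    "clip_to_extent": lambda job: f"{job}:extent",
    "render_tiles": lambda extent: f"{extent}:tiles",
    "upload_to_blob": lambda tiles: f"https://example.invalid/{tiles}",
}

def run(orchestrator, job):
    """Drive the generator, executing each yielded activity in turn."""
    gen = orchestrator(job)
    result = None
    try:
        while True:
            activity, arg = gen.send(result) if result is not None else next(gen)
            result = ACTIVITIES[activity](arg)
    except StopIteration as stop:
        return stop.value

print(run(gis_export_orchestrator, "job42"))
# -> https://example.invalid/job42:extent:tiles
```

The GIS-specific steps of a WFM-style workflow would each become an activity function in this model.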
- Apache Airflow:
- Primarily a data pipeline orchestrator (ETL/ELT), excellent for scheduling, managing dependencies and monitoring batch jobs defined as Python DAGs. Can execute any script or API call, thus capable of running ArcGIS geoprocessing or automation scripts. UI is for DAG monitoring, not general human task lists.
- If the primary need is to schedule and orchestrate sequences of automated geoprocessing or data transformation tasks (especially those involving Python), Airflow is a very robust and widely adopted solution. However, it is less suited to the ad-hoc, human-driven workflows that WFM targets.
- Prefect:
- A modern dataflow automation platform, also Python-based, often seen as an alternative to Airflow. Emphasises dynamic workflows, easier local development and a user-friendly UI for pipeline monitoring.
- Similar to Airflow, Prefect is excellent for Python-based data and geoprocessing pipeline orchestration. If the workflows are primarily automated data operations, Prefect offers a strong alternative. Not designed as a general-purpose human task management system.
- Node-RED:
- Flow-based, event-driven programming tool. Excellent for rapid prototyping of IoT integrations, API mashups and simple event-driven automations. Visual interface for wiring "nodes" together.
- Node-RED could be useful for simple, event-triggered GIS automations (e.g., "when a file arrives in Blob, call a geoprocessing service"). It is not designed for managing complex, multi-step, stateful business processes with human task assignments and detailed job tracking in the way WFM or more robust BPM engines are.
- Conclusion and Recommendation for the eMap Platform:
- While ArcGIS Workflow Manager Server offers specific conveniences for Esri-centric workflows, its benefits must be weighed against its increased infrastructure footprint, additional licensing costs and overlap with more general-purpose, often more cost-effective tools that are already available and better aligned with cloud-native principles.
ArcGIS Knowledge Server
- What it Does & Provides:
- ArcGIS Knowledge Server enables the creation, exploration, analysis and management of knowledge graphs within ArcGIS Enterprise.
- Knowledge graphs model real-world entities (e.g., people, organisations, locations, events, documents) and the complex relationships between them, allowing for sophisticated link analysis and the discovery of non-obvious connections and patterns.
- Requires an ArcGIS Data Store configured as an object store to persist the knowledge graph data. Alternatively, for advanced use cases or existing graph databases, it can connect to user-managed NoSQL databases such as Neo4j or ArangoDB.
- Users interact with knowledge graphs primarily through ArcGIS Pro, using it to build the data model (entity types, relationship types, properties), load data, conduct investigations and visualise results in link charts and maps.
- Queries against the knowledge graph are typically written using openCypher.
- Supports storing temporal data and using spatial operators within queries.
- An ArcGIS Knowledge Server is federated with the Portal.
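To give a flavour of the openCypher queries mentioned above, here is a hypothetical example embedded as a Python string. The entity labels (Organisation, Asset) and relationship type (OPERATES) are invented for illustration; the MATCH/WHERE/RETURN structure is standard openCypher:

```python
# Hypothetical openCypher query of the kind run against a knowledge
# graph: the labels (Organisation, Asset) and relationship (OPERATES)
# are invented; MATCH/WHERE/RETURN is standard openCypher syntax.
QUERY = """
MATCH (org:Organisation)-[:OPERATES]->(asset:Asset)
WHERE asset.status = 'critical'
RETURN org.name, asset.name, asset.location
"""

# In practice the query would be submitted from ArcGIS Pro or via the
# server's REST endpoint; here we only illustrate the query text.
print(QUERY.strip())
```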
- Usefulness for the New eMap Platform:
- For scenarios where there is a need to analyse complex, interconnected datasets to understand intricate relationships, networks and influences (e.g., infrastructure interdependencies, supply chain analysis, organisational structures, investigative analysis, fraud detection), a knowledge graph solution could provide significant value. ArcGIS Knowledge Server is one such solution within the Esri ecosystem that aims to address these needs.
- A knowledge graph capability would:
- Enable users to visualise, query and analyse complex networks and the relationships between diverse entities.
- Facilitate the combination and analysis of spatial data with non-spatial relational data within a unified graph context.
- Help uncover hidden patterns, dependencies and insights that are not readily apparent through traditional GIS or relational database analysis techniques.
- Consideration of Web-Based Alternatives:
- This is a rapidly evolving space with various commercial and open source solutions. Given the preference for web-based interfaces, several alternative stacks could provide comparable or even more suitable functionality.
- Examples of such alternative solutions include:
- Azure Cosmos DB paired with graph visualisation platforms such as Linkurious can provide a dedicated web UI for investigation and analysis.
- Managed Neo4j offerings (e.g., Neo4j AuraDB on Azure) coupled with Neo4j Bloom, Neo4j's own web-based graph exploration and visualisation application.
- Knowledge graph platforms such as Metaphactory, which have more of a semantic/RDF focus but do support property graphs and provide web based interfaces for building and interacting with knowledge graphs.
- Recommendation for the new eMap Platform:
- While ArcGIS Knowledge Server provides an integrated solution within the Esri stack, a thorough requirements gathering process and a comparative evaluation of alternative knowledge graph solutions is essential before its adoption.
- This evaluation should consider factors such as:
- The specific analytical and visualisation features required by end users.
- Ease of integration with existing data sources.
- Scalability of the graph database backend (Azure Cosmos DB and Neo4j AuraDB are PaaS).
- The user experience and learning curve associated with the web interface of such solutions.
- ArcGIS Knowledge Server should only be adopted if it demonstrably offers the best fit for the specific knowledge graph requirements.
ArcGIS Video Server¶
- What it Does & Provides:
- ArcGIS Video Server allows for the indexing, searching, publishing and streaming of video content as web enabled video services, enriched with geospatial and temporal context.
- It supports both on-demand video files (e.g., MP4) and live video streams (e.g., UDP, RTSP, RTMP protocols).
- It can dynamically georeference video content if metadata is available, allowing video footprints and sensor positions to be displayed on a map.
- It offers transcoding support for optimal video streaming to various devices and bandwidth conditions.
- It includes a Frameset API for exporting individual video frames at specific times or intervals for further analysis or reporting.
- An ArcGIS Video Server is federated with the Portal and configured with an ArcGIS Web Adaptor.
- Esri recommends an NVIDIA GPU environment (supporting NVENC for encoding and NVDEC for decoding) for optimal performance, especially for publishing, transcoding and concurrent streaming.
- Video Server supports High Availability (HA) configurations and horizontal scaling.
- Video Server supports `webgisdr` for backup and restore, or can use an Object Store (e.g., Azure Blob Storage) for its configuration store.
- Usefulness for the New eMap Platform:
- ArcGIS Video Server would enable integrating video streams that have a spatial component (e.g., fire tower cameras, drone footage, security camera feeds, other environmental monitoring videos) into the new eMap platform.
- This would enable users to:
- Visualise video feeds directly on maps, synchronised with geospatial telemetry (e.g., sensor location, view footprint).
- Search and discover video content based on geographic location, time and metadata.
- Analyse video content in conjunction with other GIS data layers for rich context and insights.
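Interval-based frame export of the kind the Frameset API enables can be planned with a small helper that computes the export timestamps for a clip. The function below is a sketch for planning such exports; it is not part of the Esri API.

```python
# Sketch: generate evenly spaced frame-export timestamps (in seconds) for a
# video clip, e.g. to drive a Frameset-style export at fixed intervals.
# This helper is illustrative and not part of the Esri Frameset API.

def frame_export_times(duration_s: float, interval_s: float) -> list[float]:
    """Return export timestamps from 0 up to and including duration_s,
    spaced every interval_s seconds."""
    if interval_s <= 0:
        raise ValueError("interval_s must be positive")
    times = []
    t = 0.0
    while t <= duration_s:
        times.append(round(t, 3))  # round to milliseconds for readability
        t += interval_s
    return times

# A 10-second clip sampled every 2.5 seconds yields five export points:
# frame_export_times(10, 2.5) -> [0.0, 2.5, 5.0, 7.5, 10.0]
```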
4.8.3 Implementation Considerations¶
The introduction of any new specialised ArcGIS Enterprise server role requires a structured process to ensure successful integration and operational readiness.
flowchart TB
subgraph ProcessFlow["⚙️ Process Flow"]
direction TB
A["🎯 1. Business Case Validation"]
B{"❓ 2. Approved?"}
C["🔬 3. Pilot Deployment & Testing"]
D["🛠️ 4. Develop/Update Automation"]
E["🔄 5. HA/DR Integration"]
F["📊 6. Performance Testing"]
G["🚀 7. Production Deployment"]
H["👩‍🏫 8. User Training"]
I["🏁 9. Handover"]
end
subgraph Outcomes["📤 Outcome"]
direction TB
Z["⛔️ Halt/Re-evaluate<br><small>Business case refinement</small>"]
J["✅ Specialised Role Operational"]
end
A --> B
B -- "✅ Approved" --> C
B -- "❌ Rejected" --> Z
C --> D
D --> E
E --> F
F --> G
G --> H
H --> I
I --> J
classDef process fill:#e6ffe6,stroke:#008000,stroke-width:2px
classDef decision fill:#ffe6cc,stroke:#d79b00,stroke-width:2px
classDef outcome fill:#e3f2fd,stroke:#0b5ed7,stroke-width:2px
classDef cluster fill:#f5f5f5,stroke:#666,stroke-width:1px,stroke-dasharray:5 5
class A,C,D,E,F,G,H,I process
class B decision
class Z,J outcome
class ProcessFlow,Outcomes cluster
Diagram: General process for implementing a new specialised ArcGIS Server role.
- Business Case Validation and Requirements Gathering:
- Before any technical work commences, a strong business case must be established, clearly defining the need for the specialised role, the expected benefits and key stakeholders.
- Detailed functional and non-functional requirements must be documented. This includes specific use cases, data sources, integration points, performance expectations, data volume, security considerations, user numbers and any specific HA/DR requirements for the component.
- Pilot Deployment and Testing (DEV/UAT Environments):
- The new server role should first be deployed and configured in the Development (DEV) environment, followed by the UAT environment. It should utilise existing automation frameworks (OpenTofu, Configuration Management).
- This allows for initial functional testing, validation of installation and configuration procedures and early identification of any integration challenges with existing components or data sources.
- Performance characteristics can be initially assessed in UAT to inform PROD sizing.
- Automation Development (IaC & CM):
- New OpenTofu modules or updates to existing modules will be required to provision the Azure infrastructure specific to the specialised role. This may include:
    - Dedicated Azure VMs or VM Scale Sets, potentially with specific SKUs (e.g., GPU-enabled instances for Video Server or Image Server deep learning).
    - Specific storage configurations (e.g., for GeoEvent Server's spatiotemporal big data store).
    - Networking adjustments (e.g., NSG rules for new communication ports required by the role, VNet integration).
    - Any required Azure PaaS integrations (e.g., dedicated ArcGIS Data Store instances for graph or spatiotemporal types).
- These OpenTofu modules must be parameterised for consistent deployment across DEV, UAT and PROD environments.
- Scripts for the designated Configuration Management tool must be developed or updated to automate the silent installation, licence provisioning, hardening and core configuration of the specialised server role software on the VMs. This includes managing software dependencies and OS-level configurations (e.g., Docker setup for Notebook Server).
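As a sketch of how that parameterisation might work, the snippet below renders per-environment `*.tfvars.json` bodies from a single shared definition. The variable names (`vm_sku`, `instance_count`) and the SKU value are hypothetical examples, not the platform's actual module inputs.

```python
# Sketch: emit per-environment OpenTofu variable files from one base
# definition so DEV/UAT/PROD deployments of a specialised role stay
# consistent. All variable names and values here are illustrative.

import json

BASE = {"role": "video-server", "vm_sku": "Standard_NV6ads_A10_v5"}  # hypothetical SKU

ENV_OVERRIDES = {
    "dev":  {"instance_count": 1},
    "uat":  {"instance_count": 2},
    "prod": {"instance_count": 3},
}

def render_tfvars(env: str) -> str:
    """Merge base settings with environment overrides into a *.tfvars.json body."""
    merged = {**BASE, **ENV_OVERRIDES[env], "environment": env}
    return json.dumps(merged, indent=2, sort_keys=True)
```

Generating the variable files from one source of truth avoids the drift that creeps in when each environment's inputs are maintained by hand.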
- High Availability (HA) and Disaster Recovery (DR) Integration:
- The HA and DR strategy for the new component must be defined and aligned with the existing frameworks.
- HA: This may involve deploying the component in an HA configuration in Melbourne, such as an active-passive setup or multiple active instances behind a load balancer, leveraging Azure Availability Sets (or Availability Zones). The specifics will depend on Esri's supported HA patterns for each role.
- DR: The strategy for DR to the Sydney region must be determined. This could involve:
    - Replicating associated data stores (e.g., spatiotemporal or graph data stores).
    - Re-deploying the component in Sydney using IaC/CM during a DR event from a "pilot light" state.
    - Including its state in `webgisdr` backups, if applicable and supported by Esri for that role.
- Relevant DR runbooks must be updated.
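For roles whose state is covered by `webgisdr`, the scheduled export reduces to invoking the utility with an export flag and a properties file. A minimal sketch of the command construction, assuming placeholder install and config paths:

```python
# Sketch: assemble a scheduled `webgisdr` export command (Esri's web GIS
# backup utility). Both paths below are placeholders for this environment.

from pathlib import Path

WEBGISDR = Path("/opt/arcgis/portal/tools/webgisdr/webgisdr.sh")  # placeholder path
PROPERTIES = Path("/etc/emap/webgisdr.properties")                # placeholder path

def export_command(tool: Path = WEBGISDR, props: Path = PROPERTIES) -> list[str]:
    """Return the argv list for a full webgisdr export run."""
    return [str(tool), "--export", "--file", str(props)]

# subprocess.run(export_command(), check=True) would execute the backup;
# only the command construction is shown here.
```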
- Performance Testing (UAT/Staging Environment):
- Once installed and HA/DR plans are in place, performance and load testing should be conducted. This is ideally done in a dedicated staging environment that mirrors PROD.
- This testing ensures the component meets its performance requirements under realistic load conditions and helps identify any bottlenecks or tuning opportunities.
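Summarising a load test often reduces to checking a percentile of the observed latencies against a non-functional requirement. A minimal sketch, assuming a hypothetical p95 < 2 s target:

```python
# Sketch: evaluate load-test latency samples against an illustrative
# non-functional requirement (p95 response time under 2 seconds).

import math

def percentile(samples: list[float], pct: float) -> float:
    """Nearest-rank percentile (rank = ceil(pct/100 * n)) of a non-empty list."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def meets_slo(latencies_s: list[float], p95_limit_s: float = 2.0) -> bool:
    """True if the 95th-percentile latency is within the assumed limit."""
    return percentile(latencies_s, 95) <= p95_limit_s
```

Reporting percentiles rather than averages surfaces the tail behaviour that users actually notice under load.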
- Production Deployment:
- Following successful testing and validation in UAT/Staging, the new specialised server role can be deployed to PROD using the established IaC and CM automation scripts.
- Monitoring and alerting specific to the new component must be configured in Azure Monitor. This includes collecting relevant logs and performance metrics and defining alert rules for critical operational conditions.
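Those alert rules can be thought of as metric/threshold pairs of the kind Azure Monitor evaluates. The metric names and thresholds below are illustrative assumptions, not the platform's actual rules:

```python
# Sketch: a minimal threshold-style alert check. Metric names and
# threshold values are illustrative placeholders only.

ALERT_RULES = [
    {"metric": "cpu_percent", "operator": "gt", "threshold": 85.0},
    {"metric": "failed_requests_per_min", "operator": "gt", "threshold": 10.0},
]

OPERATORS = {"gt": lambda v, t: v > t, "lt": lambda v, t: v < t}

def firing_alerts(metrics: dict[str, float]) -> list[str]:
    """Return the names of metrics whose latest value breaches a rule."""
    return [r["metric"] for r in ALERT_RULES
            if OPERATORS[r["operator"]](metrics.get(r["metric"], 0.0), r["threshold"])]
```

Keeping the rules as data makes them easy to review alongside the IaC definitions and to extend when the new role exposes additional metrics.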
- User Training and Operational Handover:
- Targeted training must be provided to end-users who will interact with the new capabilities (e.g., data scientists for Notebook Server).
- Training and documentation should be provided to GIS Engineers and operations staff.
- Operational documentation for common tasks, troubleshooting guides and backup/recovery procedures specific to the new component must be updated and formally handed over to the support desk.