Geocortex Analytics dashboards allow you to monitor a collection of GIS resources all in one, easy-to-consume view. Dashboards can be personalized to contain whichever resources and infrastructure you want to monitor.
Perhaps you want to monitor the most-used resources for a set of your users? With Geocortex Analytics, you can build a dashboard that displays all the resources that are critical to the productivity of those users.
In this Geocortex Tech Tip, Derek Pettigrew (Geocortex Analytics Product Manager) shows you how to create a dashboard and populate it with the different resources you want to monitor.
Hi, my name is Derek. I’m the Geocortex Analytics Product Manager, and today we’re going to take a look at configuring dashboards [in Geocortex Analytics], so you present all the panels you’re looking for in one place.
Let’s add a dashboard by going to the top right-hand corner, where we see “my dashboards”. We select this, add a new dashboard and name it based on the theme. I’m going to look at a theme for LA County services, so I’ll call it that: “LA services”. By creating the dashboard, I now have one available, but I have not yet added any content to it.
To add content, I need to go and find the panels I want to add to my dashboard. We’re looking at services performance overall, so we’ll probably want to see things that impact it, such as [the physical] server, the ArcGIS server, and maybe Geocortex Essentials as well.
Now looking through [Geocortex Analytics], I have a panel in front of me with a lot of service information on it, but it’s not just for the LA County area that I’m looking for. What I can do is apply a filter to it and type in only things for “Los Angeles”. When I apply this filter, now I can see all my services for “Los Angeles”, and this filter will be retained when I apply it to my dashboard.
I go over here, to this dashboard icon, and I can add [this panel] to the dashboard I just created for LA Services Performance. Now I have that added; I can also go through and take a look at my server that [the LA County service] resides on to see how it’s performing so I can correlate that information. I’m going down to my actual memory usage - I think this is very useful to understand how my service is performing to ensure the actual hardware can handle the demands. So, I’ll also add that to my dashboard for LA Services Performance.
Finally, I’m going to go and filter this whole menu, and take a look for my LA County application that I want to monitor. Selecting this, what I’m looking for is to see the overall traffic that’s coming in, to see how that actual service is performing under the demand of end-users. And I’ll also add that one in to my new dashboard.
Now that I’ve gone through and added numerous items to my dashboard, let’s take a look at it. This is what I’m seeing in my dashboard. I’m able to go through and have this wonderful panel just with my filtered results for “Los Angeles”. I’m able to see the server it resides on, and I’m able to go down and see the actual visitor types coming through to my applications, all in one place. And these dashboards are also exportable to PDF for sharing.
That’s how to create a [Geocortex Analytics] dashboard. Thank you for your time.
Last year we released Geocortex Workflow 5 and – for the first time – we delivered a Geocortex product as software-as-a-service (SaaS). This is the direction Esri has been moving with ArcGIS Online, and it has been the industry standard for most business software applications since the early 2000s: GIS arrived a little late to the game.
For those who aren’t familiar, SaaS is a way of delivering centrally-hosted software via the Internet – instead of downloading and configuring a piece of software in your environment, you access the application in a web browser.
One of the key benefits of SaaS is the concept of continuous deployment. Because applications are centrally hosted by the provider, new software updates can be pushed to customers with a much higher frequency.
Future-proofing your investment in Esri technology has always been at the forefront of our product direction, and this is an inherent protection that comes with SaaS delivery.
Since our beta release of Geocortex Workflow 5 last August we’ve built and deployed 11 successive releases – each containing exciting new features and improvements – without interruption to our users. Customers didn’t require compatibility testing, or incur any downtime to upgrade their applications. In fact, most probably didn’t notice: they simply logged in to do their work and were running the most current version of the software.
Many of us have experienced the pain of needing dedicated GIS- or IT professional assistance to complete an upgrade. It can take the better part of a workday, and involves significant planning and testing to ensure everything works as it did before (and even then, it sometimes doesn’t).
Due to the resources required to keep large, on-premises software deployments current, many organizations avoid upgrades, only to find themselves in a sticky situation a few years down the road when they’re finally forced to do so. This is never a good situation to be in, and the ease of upgrades with SaaS helps organizations limit the technical debt that comes with delays.
Rapid deployment of new features
We all love new features; with lots of traditional solutions, however, we can end up waiting months for new features that need to be scheduled into a long release cycle. SaaS, in comparison, allows new features to be built and deployed at a rapid pace. In some cases, new features can be delivered within a matter of days.
There is no better example of this than our 2017 Geocortex Business Partner Summit, an annual event where we bring all our international partners to our Victoria Head Office for three days of learning and collaboration. We dedicated the first day to a Geocortex Workflow Hackathon, where our partners teamed up to build the best workflows they could.
Our partners built some really cool apps, and provided a bunch of great feedback to our Product team about their experiences with Geocortex Workflow 5. By the last day of the Summit (roughly 48 hours later), our Product Manager was able to show them some of the feedback that the team had already implemented in the product.
Applications that are always secure
As noted above, organizations are often saddled with technical debt from not upgrading once new versions become available. Not only can this result in more work when you’re finally forced to upgrade, it can expose your applications to security vulnerabilities.
Many of our customers use their GIS for mission-critical business applications, which makes security even more critical.
The rate at which new releases are pushed to production ensures that your applications are always secure. Instead of waiting to upgrade to new versions, every time you log in you know you’ll be working with the latest version of the software, tested to ensure the strongest security measures are in place.
This is the approach most SaaS providers have adopted. For example, your Google Chrome browser has likely updated every week without you noticing. Google takes security very seriously, and pushing consistent updates lets them ensure that their users are always running the most secure version of the browser.
Lower cost, better uptime
SaaS applications are centrally hosted, which ensures your applications always run in an optimal environment. When customers host their own applications, hosting configurations vary, and the environment may not be tuned for the best performance.
For Geocortex Workflow, we use world-class hosting services from Microsoft Azure. If anything goes wrong with the hardware in the hosting environment, teams of Microsoft engineers are standing by to fix the issue. And if something goes wrong with the software, we have a team ready to investigate and fix the issue, which can typically be achieved by pushing a new release or software patch.
The cost of hosting your applications is included in the cost you pay for the product. The hosting environment is scalable and includes protections such as failover, which are both difficult and expensive to achieve in your own environment. You don’t need your IT department to worry about managing your servers - we take care of all the hosting and associated maintenance. It’s a win-win: you free up your people for more important work, and you get a hosting environment that’s set up in the best possible way for the product.
It’s easy to see why SaaS offerings are becoming the standard for business software solutions. Being able to offload the bulk of the costs associated with hosting your own applications can save you (and has saved many organizations) thousands of IT hours and dollars each year. The lower initial cost and the rapid delivery of new, powerful features are why we think SaaS is the direction more and more GIS tools will move towards.
We’ll still be offering on-premises solutions for situations where it makes the most sense, but SaaS is where we’re headed. This is the model Esri has adopted with ArcGIS Online, and is sure to be the model for future ArcGIS products.
If you’d like to chat about what adopting a SaaS-first approach would look like for your organization, please feel free to get in touch with us. You can also get your hands on Geocortex Workflow 5 in our Discovery Center, complete with tutorials, demonstrations, and sample applications you can try.
One thing that we’ve been hearing, though, is that developers want the ability to apply their own code in the workflows they’re building.
So, what are “activities”? In the simplest terms, they’re the building blocks of a workflow - each activity represents a unit of work. Geocortex Workflow 5 offers more than 150 pre-built activities that chain together to automate almost any task. Activities such as geocode, query layer, set the map extent, get user info, calculate distance, buffer geometry, run geoprocessing, and so many more allow you to streamline even the most complex GIS and business tasks.
Flex your development chops and write activities to perform tasks that weren’t previously possible – or were extremely complex to assemble with pre-built activities. You can combine your programming skills with Geocortex Workflow’s intuitive, activity-based design to build powerful applications.
Custom activities can be built for yourself, or for others in your organization; even non-developers can work with them as they would any other activity in Geocortex Workflow Designer. And provided that your technology of choice supports the functionality you’re building, custom activities can be consumed in Geocortex and/or Web AppBuilder for ArcGIS applications.
Take Geocortex Workflow 5 even further
While most tasks can be automated with the pre-built, out-of-the-box (OOTB) activities offered with Geocortex Workflow 5, you can now build anything you want with the SDK. Custom integrations, talking to custom web services, connecting with third-party APIs, and interfacing with custom code in your existing apps are now all possible.
Here are a few examples of what you can do with custom activities:
Perhaps you want to integrate with a third-party system like SAP®. While this is possible with pre-built activities, you’ll be manually assembling workflow logic to make the web requests, parse the responses, and execute your business logic. With the latest updates, you can achieve a result that’s cleaner, more efficient, and more consumable by wrapping the logic in a few simple custom activities.
Many common tasks are time-consuming to build – maybe you find yourself using the same pattern over and over in one workflow. Instead of following this repetitive pattern, you can bundle all the logic within a single custom activity. An example might be sorting map features by multiple columns. Pre-built activities are available that will sort data by one column, but it’s more efficient to write a custom activity to sort by multiple columns than it is to link activity after activity – especially if you need to perform these tasks across multiple applications and workflows.
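As a rough sketch of that multi-column sort example (the feature shape and attribute names here are assumptions for illustration, not the Geocortex SDK’s actual types), the bundled logic might look like this:

```typescript
// Hypothetical feature shape for illustration only; the real SDK defines
// its own feature and attribute types.
interface Feature {
    attributes: Record<string, string | number>;
}

// The unit of work a custom activity would wrap: sort features by several
// columns in priority order, instead of chaining single-column sort
// activities one after another.
function sortByColumns(features: Feature[], columns: string[]): Feature[] {
    return [...features].sort((a, b) => {
        for (const col of columns) {
            const av = a.attributes[col];
            const bv = b.attributes[col];
            if (av < bv) return -1;
            if (av > bv) return 1;
        }
        return 0; // all compared columns are equal
    });
}
```

Packaged this way, the same sort logic can be dropped into any workflow, in any application, without rebuilding the chain of activities each time.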
Set a standard
Unless your organization follows strict guidelines for building custom apps and widgets, there is always the risk that developers will use different patterns and approaches to develop custom code. This makes it difficult for others to maintain or update the code; it can be a bit like the wild west.
This can be mitigated with Geocortex Workflow 5’s custom activities. All activities have the same, simple signature of inputs, outputs, and an execute method. Following the activity-based pattern ensures you have a standard practice for building custom logic.
With activities, you are implementing a unit of work rather than a large, rigid solution. This promotes reusability and your code will be easier to write, interpret, test, and maintain. Any developer will be able to pick up your custom activities and understand how to work with them.
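To illustrate that signature pattern (the names and shapes below are hypothetical, not the actual Geocortex Workflow SDK API), a custom activity boils down to typed inputs, typed outputs, and a single execute method:

```typescript
// Hypothetical input/output shapes for illustration; the real SDK defines
// its own activity interfaces and registration mechanism.
interface BufferDistanceInputs {
    distanceMeters: number;
    safetyFactor: number;
}

interface BufferDistanceOutputs {
    bufferedMeters: number;
}

// Every activity follows the same simple contract: inputs in, outputs out,
// with the unit of work implemented in one execute method.
class BufferDistanceActivity {
    execute(inputs: BufferDistanceInputs): BufferDistanceOutputs {
        return {
            bufferedMeters: inputs.distanceMeters * inputs.safetyFactor,
        };
    }
}
```

Because the contract never changes, another developer can pick up this activity and know exactly where the inputs arrive, where the outputs leave, and where the work happens.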
You can also control how custom activities are presented to other users in the browser-based Geocortex Workflow Designer. They can be configured to look like the existing OOTB activities, helping ensure a consistent pattern across your apps.
Custom activities in Web AppBuilder for ArcGIS®
At Latitude Geographics, we’ve always built complementary technology to help our customers accomplish even more with Esri’s ArcGIS platform. With Geocortex Workflow 5, we’ve taken this to a new level by allowing you to build workflows that run inside Web AppBuilder for ArcGIS.
If you’re using Web AppBuilder for ArcGIS, creating custom activities with Geocortex Workflow 5 is the preferred alternative to writing a collection of custom widgets. Initial deployment will require a similar amount of effort, but ongoing maintenance and modifications of custom activities require significantly less time (and pain!).
If you write a custom widget for Web AppBuilder for ArcGIS and want to deploy it to multiple apps, you need to edit the source code in all the applications using that widget each time a modification is required. With Geocortex Workflow 5, the custom code is packaged in an activity, and you only need to modify the source activity for changes to be applied across all your applications.
You can access the SDK in our Documentation Center. Just look for the .zip file that contains all the necessary instructions you need to get started.
Let us know how it goes
As you get going with the new SDK, we want to hear your feedback. If you have questions, comments, or concerns, please get in touch with us to let us know.
We’d also love it if you share what you’re building with us and other users in the Geocortex Workflow Community. This is a great place to connect with other users - everyone benefits from sharing tips, tricks, and sample workflows.
Our Product Development team has been hard at work for the past few months, adding functionality for GIS professionals to work more intuitively with data in tables (after all, your data is the most important aspect of your apps) and bringing in editing functionality that has typically been reserved for desktop GIS tools.
One of these features is a redesigned results table that dramatically improves performance, introduces infinite scrolling for hundreds (or even thousands) of records, and allows you to perform actions on specific results or sets of results. Geocortex Essentials users also benefit from advanced editing tools that allow you to cut and union geometries while maintaining attribute integrity.
Geocortex Essentials 4.9 (alongside Geocortex Viewer for HTML5 2.10 and Geocortex Mobile Framework 2.3.1) are now available for download. You can learn more about the latest releases on the product release page, or download the new versions in the Geocortex Support Center.
Additional support for Geocortex Workflow 5
We’ve continued to add support for Geocortex Workflow 5 to deploy offline workflows, dynamic forms, and new activities in Geocortex Essentials applications.
Geocortex Workflow 5 is our newest product that allows you to extend Geocortex and Web AppBuilder for ArcGIS® applications by turning complex business processes into simple, guided end-user interactions. Instead of writing thousands of lines of code to meet custom requirements, choose from more than 150 pre-built activities that chain together to automate almost any task.
Building upon Geocortex Essentials’ popular workflow capabilities, Geocortex Workflow 5 is also offered as a standalone product and enables you to build workflows right in your browser.
Geocortex Workflow 5 provides the ability to take workflows offline to support field operations. You can download workflows and run them in your mobile viewers, keeping field workers productive… even in areas with no network connectivity.
For example, a municipal field engineer may need to perform fire hydrant inspections throughout a rural neighborhood. With Geocortex Workflow 5, the inspection application and corresponding workflow can be downloaded right to the worker’s tablet at the beginning of the day, and the assigned inspections can be completed without any network connection.
When the day is done and the tablet has returned to network connectivity, the field worker can sync the inspection data back to the database. This ensures that the field worker can continue to complete inspection, regardless of where they are, and that the data is readily available to the people in the office that need it.
Explore Geocortex Workflow 5
If you’d like to learn how you can take your business processes offline, or extend applications you’ve built with Web AppBuilder for ArcGIS, you can get a feel for the product in theGeocortex Workflow 5 Discovery Center. Here you’ll find demonstrations of different deployment scenarios, tutorials, and sample applications.
Or, if you’re ready to get started with Geocortex Workflow 5 you can sign up for a free developer license here.
Over the past few weeks, I’ve shared foundational knowledge about how data is stored and managed in the pipeline industry. My first post introduced ArcGIS Pipeline Referencing (APR) and explained some options operators have in adopting it. My second post worked to demystify the confusing world of pipeline data models (there’s a lot to consider).
In this post, I will outline important information you need to consider when choosing a data model for your organization, including:
Limitations of current data models;
How APR is addressing these limitations; and
Questions you should ask yourself to help assess the best data model for your organization (should you choose to move to APR).
Limitations of Existing Models
A data model is defined as: “An abstract model that organizes elements of data, and standardizes how they relate to one another and to properties of real world entities.”
In the pipeline space, real-world entities include not only the pipe and valves that make up the pipeline system, but all of the things that happen to and around the pipe. Things such as repairs & replacements, surveys & patrols, one-call, cathodic protection, ILI & CIS assessments, HCA & class location, land ownership & rights-of-way, and crossing information all have components that, in one form or another, need to be captured in your GIS.
The differing needs of these complex data representations expose limitations in legacy systems. And in a world where critical business decisions must be made from this data, identifying limitations and addressing them is an important step as we move to next-generation solutions.
Limitation #1: Data volume
As the years have progressed, the operational and regulatory needs surrounding pipelines have increased. These needs are driving new levels of inspections and analyses on pipeline systems - resulting in more data, both in terms of volume and complexity. The legacy systems were simply not designed to handle the volume of data current programs produce.
An example is the case of Inline Inspection (ILI) and Close Interval Survey (CIS) data. A single ILI or CIS inspection results in hundreds of thousands of records. With assessment intervals recurring every 1-7 years -- and operators performing dozens of inspections each year -- the resulting records from these inspections alone add millions of records to the database. This doesn’t include follow-up inspections, digs, and run comparison activities.
When you couple the sheer volume of records with complexities surrounding data management and the need to provide a high-performance environment, limitations in the system are quickly exposed. These limitations force operators to make difficult data storage decisions, often choosing to remove subsets of data from the system of record. This is sub-optimal to say the least; it significantly impacts your ability to view and analyze important trends in the data.
Limitation #2: Engineering Stationing
Engineering stationing is important, complex, and deeply rooted in pipeline data management. Before modern GIS, operators kept track of the location of pipelines and associated appurtenances using engineering stationing on paper or mylar sheets. Because the vast majority of pipelines in use today were constructed before modern GIS technology existed, large volumes of data were referenced with this approach.
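For readers unfamiliar with the notation: under the common US convention, a station such as "12+34.5" encodes a distance of 1,234.5 feet along the line (the number before the plus sign counts 100-foot stations, the number after it is the remainder). A minimal sketch of that conversion, assuming that convention holds:

```typescript
// Convert engineering station notation (e.g. "12+34.5") to a linear measure
// in feet, assuming the common US convention of 100-foot stations.
function stationToFeet(station: string): number {
    const [stations, remainder] = station.split("+");
    return Number(stations) * 100 + Number(remainder);
}
```

Real stationing data is messier than this (station equations, re-routes, and broken chainage all complicate the picture), which is part of why maintaining it is such an overhead for operators who don’t need it.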
Engineering stationing doesn’t benefit all operators; however, companies that manage gathering and distribution assets find this method burdensome … and dare I say unnecessary?
When traditional data models were developed, the need to adhere to legacy engineering stationing outweighed the need to redesign the entire system to favor a spatial-first approach. But as technology has improved, and more users have embraced spatial data, new methods to blend modern (spatial-first) and legacy (stationing-first) models have emerged. Operators need this flexibility when managing their assets.
Limitation #3: Native support for the Esri Platform
The emergence of the Pipeline Open Data Standard (PODS) represents the last major shift in data storage solutions for pipelines, and it happened nearly 20 years ago. At that time, the GIS landscape was both immature and fragmented. As a by-product, PODS was designed specifically to be GIS-agnostic. In the nearly two decades since, Esri has emerged as the predominant provider of spatial data management, and they have developed a suite of solutions that enable stronger collection, management, and analysis of data.
Chances are your organization embraces Esri for spatial data management and content dissemination, which raises the question: “If your organization has standardized on Esri technology, does it make sense to employ a data structure that does not natively support the environment?” (Hint: probably not.)
Addressing and Improving Limitations
The core of APR has been engineered to address important limitations currently felt due to the existing designs of PODS and APDM. APR directly addresses the three limitations described above.
Improvement #1: Data volume
Understanding the need to support large datasets, time-aware data, and the ability to offload storage to other systems, APR has been engineered to handle high volumes of data more efficiently, with a focus on scalability. To achieve this, configurable rules allow a more fine-grained approach to managing data during routine line maintenance. Implementations are no longer limited to either keeping the data referenced to the LRS or detaching it.
Changes like these allow operators to keep more data in the system, providing a baseline for more powerful analysis and decision making.
Improvement #2: Engineering Stationing
As explained above, engineering stationing is firmly rooted in pipeline data management, but it’s not required for all operators. New construction, gathering systems, and vertically-integrated distribution companies are finding the rigorous application of stationing to be unnecessary overhead. If your current database repository requires it, and your organization doesn’t rely on it, you are taking on unnecessary data management cycles - costing valuable time and money.
APR not only provides the ability to manage data in stationed and non-stationed methods: its flexibility allows for both stationed and non-stationed lines to exist in the same model. Let that sink in for a bit: Operators that have deployed two separate data management systems can now consolidate the management of these assets! This functionality benefits a majority of the clients I’ve worked with over the years.
Improvement #3: Native support for the Esri Platform
As I stated in my previous post, APR is (possibly most importantly) integrated with the ArcGIS platform. You can perform complex long transactions on your data, analyze it in ways that have not been possible before, keep track of product flow using the Facility Network, and get the data in the hands of your organization with methods that are integrated, fluid, and connected.
Considerations for Implementation
If you’re considering implementing ArcGIS Pipeline Referencing (APR), the question of why -- and which data model to use with it -- has more to do with your business than with IT; success can be achieved with either model.
But how do you decide which one is best for your organization? Here are some questions to consider as you’re laying the foundation for your next-generation GIS implementation.
1) Business focus: What segment of the vertical are you in?
If you are a distribution company with transmission assets, the decision is pretty clear: you should choose the Utility and Pipeline Data Model (UPDM). It’s designed as a distribution-first model, allowing you to integrate the management of distribution and transmission assets in a single model.
If your company is ‘upstream’ of distribution, the answer gets a bit trickier. Both models are adequate, but my vote tends to lean towards PODS for a few reasons:
Out of the box, PODS supports APR somewhat more natively than UPDM for operators without distribution assets.
Are you a liquids operator? As UPDM is focused on gas utility and transmission, the PODS model will provide a better solution for those moving liquid products.
PODS is a thriving community of operators and vendors working together to design a comprehensive model for the industry. This collection of subject-matter expertise is invaluable to operators – and provides an opportunity to share your experience with like-minded individuals.
2) Existing model: What are you using now?
As you consider moving to APR, understand that it’s going to require a data migration. The existing system will need to be mapped and loaded into the new solution. If you are currently using PODS and are a gathering, midstream, or transmission company, APR with PODS is probably the best solution to implement. It’s likely that your existing data will migrate more seamlessly, and the model will make more sense to those that manage and interact with the data.
If your organization is primarily gas distribution, and you’ve implemented a PODS model for a small subset of high-pressure assets in the system you manage, consider UPDM. You can take advantage of the intended benefits and consolidate those assets into a common platform.
3) Other programs: ILI, CIS, other survey, Cathodic Protection
If your company has a strong investment in recurring inspections, PODS again rises as the preferred model, especially considering the efforts of the PODS Next Generation initiative around how to efficiently store and process this data moving forward.
With the growing importance of importing and exporting data (due to acquisitions, divestitures, etc.), analysis, and reporting, a system that promotes standard mechanisms to exchange data becomes increasingly important. With the work the PODS organization is putting into a data interchange standard, it again rises as the preferred model.
There isn’t just one approach, but there is a best approach for your organization
While this change is beneficial for operators, many things need to be considered before you commit to an approach. I hope my series of posts provides some clarity for you. To stay up-to-date on the data model landscape and the tools surrounding it, I encourage you to follow the PODS association and Esri. The work of these two organizations in the pipeline space is a great thing for our industry.
If you’d like to discuss any of these concepts further, or would like to have a conversation about which model is best for your implementation, please get in touch with me here. I and the rest of the Energy team at Latitude are eager to offer our years of expertise to help you.
Why do we talk about data models so much in the pipeline space? From my years interacting with operators, vendors, and regulators, I believe it comes down to a handful of reasons:
The regulatory landscape;
The need for thorough and accurate data;
Meeting demands of complex implementation architectures;
Maintaining interoperability; and
Aligning with industry standards for linear referencing.
The pipeline data model landscape can be a difficult one to navigate. Operators work with significant amounts of data and there is no shortage of models to explore.
In my last post, I introduced ArcGIS Pipeline Referencing (APR). APR is at the core of an integrated offering from Esri that incorporates a linear referencing system (LRS) into the ArcGIS® platform. APR focuses on the “core” of the LRS, with the data on the line (events) being modeled in either the Pipeline Open Data Standard Next Generation (PODS) or the Utility and Pipeline Data Model (UPDM).
This is a slight deviation from previous approaches, which has introduced some confusion. PODS and UPDM provide database models to organize your pipeline data, while APR provides a set of tools to manage and interact with it inside your GIS. Hopefully the diagram below helps explain it a bit.
With the emergence of Integrated Spatial Analysis Techniques (ISAT), pipeline operators have had methods to store data about their systems and the surrounding environment since at least the early 1990s (some methods probably pre-date that). These methods have continued to develop through the work of the PODS organization and contributors to the ArcGIS Pipeline Data Model (APDM).
Each of these have offered unique benefits to the industry, but they’ve also introduced unneeded fragmentation to the landscape. As I mentioned in my previous post, APR helps simplify this by providing consolidation.
Pipeline Open Data Standard (PODS)
PODS is the data model standard for the pipeline industry. Founded in 1998, PODS was developed to extend legacy models (ISAT), and provide a baseline for software solutions in the industry. Since that time, PODS has established itself as the industry standard.
The success of PODS is rooted in the unique nature of operators and vendors coming together to meet the storage, analysis, regulatory, and reporting needs of the industry. It is important to note that PODS is more than a data model: it’s a group of individuals coming together to discuss, evaluate, and establish the data needs for the industry. The work of the PODS organization extends well beyond how to model a pipeline in a database.
PODS exists predominantly in two variations: PODS Relational and PODS Spatial. Both models share a structure and format that adheres to the PODS standards, but differ in how they’re implemented. PODS Relational leverages core relational database standards, and PODS Spatial provides a native implementation for Esri Geodatabases.
Important to note: even though PODS Relational is designed as a GIS-agnostic data model (i.e. it’s not a geodatabase), nearly every implementation I have worked with has vendor-developed implementation methods and toolsets that integrate with Esri’s ArcGIS platform.
ArcGIS Pipeline Data Model (APDM)
APDM is the Esri pipeline data model template for pipeline assets. As with all of the models provided by Esri, APDM is a method for operators to access a structured data model, free of charge, in full support of an Esri implementation.
The template nature of APDM differs from the standards designation of PODS by allowing any portion of the base template to be altered to meet implementation requirements. This flexibility is the single biggest deviation from the standards-driven approach of PODS. Another separation between APDM and PODS is that APDM focuses on the features that make up the pipeline network itself, and is not intended to be as encompassing as the PODS models.
With the release of the UPDM, APDM has ultimately been retired.
Utility and Pipeline Data Model (UPDM)
As mentioned, UPDM has essentially replaced APDM as the recommended Esri pipeline data model. This model provides an implementation foundation for gas and hazardous liquids industries. UPDM has been designed to work with or without the APR linear referencing component of the ArcGIS platform.
Most importantly, UPDM is the first model released that allows vertically-integrated utilities (gas distribution companies that operate regulated, high-pressure lines) to consolidate database schemas, and centralize data management to a single model.
I hope this post has helped demystify the world of pipeline data models, as there is a lot to consider and it can be difficult to understand.
Next week, I will dive into what you should consider when choosing the best pipeline data model for your operation, including the limitations of different models, how APR is addressing the limitations, and the questions you should be asking yourself.
The UNDSS provides safety and security services to 120,000 UN personnel in 117 countries, many of whom work in challenging environments with increased risk to personal safety. Protecting UN personnel and enabling them to do their work is at the heart of what the UNDSS does.
To be successful, they must be coordinated, informed, and proactive. They maintain safety by analyzing threats, coordinating security in the field, supporting peacekeeping, collaborating with NGOs, and providing security at major events.
Andre Dehondt and Hwa Saup Lee of the UNDSS Crisis Management Information Support Section will present how their GIS helps them support this mission. Their systems provide critical information to decision makers that allows them to fulfill their duty of care obligations. Geocortex is currently used as the foundation for their Threat and Risk Assessment and Mobile Travel Advisory applications.
This is a presentation you won’t want to miss! The UNDSS does extremely important work, and they are a great example of how a GIS can help save lives and protect critical infrastructure.
We’re less than a month away from the Geocortex User Conference
This will be our first in-person conference in more than 10 years, and it’s sure to be a fun, informative, and inspiring few days! The conference takes place October 24 & 25 in Washington, DC. You can learn more about the 2017 Geocortex User Conference and register using the button below.
ArcGIS Pipeline Referencing (APR) is an extension to Esri’s ArcGIS platform that provides an event-based GIS, with native support for linear referencing systems (LRS), through standard Esri tools. APR allows pipeline operators to standardize data management and take advantage of established patterns for connecting this data with other business systems.
APR addresses many challenges pipeline operators have faced over the years. From consolidating multiple editing environments to providing a single solution that caters to gathering, midstream, transmission, and distribution assets, APR is enabling companies to get more from their investment in spatial data. Coupling this extension with the ArcGIS platform provides a level of integration and visibility previously unrealized in this space.
Why does it exist?
Linear referenced GIS exists across multiple industries, including roads and highways, pipelines, rail, and water/wastewater. While there are differences in how LRS is implemented within these industries, the core principles are the same: provide linear features representing the asset’s location on the Earth (the centerline); provide a standard method to reference features along the line (the linear referencing); and locate appurtenances relative to the line (the events).
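The core idea behind those three principles can be sketched in a few lines of code. This is a minimal, hypothetical illustration of measure-based event location, not how APR is implemented: the centerline is a list of vertices carrying (x, y, measure), and an event at a given measure is found by interpolating along the segment that contains it.

```python
# Illustrative sketch of the core LRS idea: given a centerline whose
# vertices carry measure values, interpolate the x/y location of an
# event at a given measure. Names and coordinates are made up.

def locate_event(centerline, measure):
    """Return (x, y) on the centerline at the given measure value."""
    for (x1, y1, m1), (x2, y2, m2) in zip(centerline, centerline[1:]):
        if m1 <= measure <= m2:
            t = (measure - m1) / (m2 - m1)  # fraction along this segment
            return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    raise ValueError("measure is outside the centerline's range")

# A simple two-segment centerline: each vertex is (x, y, measure).
line = [(0.0, 0.0, 0.0), (100.0, 0.0, 100.0), (100.0, 50.0, 150.0)]

print(locate_event(line, 125.0))  # halfway up the second segment: (100.0, 25.0)
```

Real LRS implementations also handle calibration points, measure gaps, and route re-alignment; the point here is only that an event’s position is stored as a measure, not as a coordinate.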
Through the years, proprietary implementations of LRS systems have been built across many verticals. This has resulted in a complex and fragmented approach to managing linearly referenced data in a GIS. APR provides consolidation to this landscape, and lays the foundation for next-generation solutions.
What are the benefits to operators?
Operators stand to be the biggest beneficiaries of Esri providing a native pipeline referencing solution.
Standardization: With the history of divergent data model implementations in the pipeline space, operators have been limited to software that supports their underlying LRS. APR is disrupting the status quo and establishing a new baseline, ultimately allowing you to select the most effective solution for your needs without being restricted by compatibility with your data model implementation.
Data Governance: The tight coupling of GIS and linear referencing benefits those relying on high-quality, accurate data. With the rise in GIS, and Esri driving the adoption, many GIS technicians and analysts are already knowledgeable in how to manage spatial data with Esri technology. Standardizing on APR provides operators with access to a broader range of technical staff that are capable of managing LRS data.
Consolidation: With APR, multiple linear referenced systems can be supported in the same model. Many operators manage assets across a large supply chain. Long-haul transmission lines have established patterns relying on engineering stationing, while those overseeing gathering and distribution assets tend to forgo true engineering stationing in lieu of routes and measures. Before APR, this meant implementing multiple data models, bringing with it varied methods for managing the data. Now you can consolidate: one system, one model, and one set of editors can manage all assets with the same tools.
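The difference between engineering stationing and plain routes-and-measures is easy to show concretely. The snippet below is a hypothetical helper, not part of APR: it converts a station string like "12+34.5" (1,234.5 ft from the line’s origin) into the continuous measure that a routes-and-measures model would store directly.

```python
# Hypothetical helper contrasting engineering stationing with a plain
# route measure. Real systems also handle station equations and
# re-stationing; this sketch only parses simple "SS+FF.F" strings.

def station_to_measure(station: str) -> float:
    """Convert '12+34.5' style stationing to a continuous measure in feet."""
    hundreds, offset = station.split("+")
    return float(hundreds) * 100 + float(offset)

print(station_to_measure("12+34.5"))  # 1234.5
print(station_to_measure("0+50"))    # 50.0
```

Supporting both notations against the same centerline is exactly the kind of consolidation described above: one model, with stationing treated as a presentation of the underlying measure.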
What is UPDM (Utility and Pipeline Data Model) and how does it relate to APR?
UPDM is an accompanying data model to APR with a focus on unifying the implementation of vertically-focused operators. While UPDM provides an effective data repository, it is not the only data model compatible with APR. For example, the PODS Association is building the Next Generation model in compliance with APR.
The question then becomes, what data model should you choose for your implementation? I will be following up this post with a deep dive into the many data models that are available, and hopefully I can help you make better sense of these options.
Why are we engaged in this conversation?
Latitude Geographics provides solutions to gathering, midstream, transmission, and distribution clients around the world. In the past year, we’ve seen a sharp rise in the number of operators implementing and embracing APR. Geocortex has been 100% compatible with every implementation we’ve worked on, providing immediate value for clients leveraging PODS and UPDM.
With the release of powerful linear referenced toolsets, Geocortex can further your success with APR. Please reach out to email@example.com for more information.
One of the most powerful capabilities of a GIS is the ability to integrate your mapping applications with 3rd party business systems and data sources. Integrating with key business systems extends the reach and capabilities of your applications, and ensures you and your team remain efficient and effortlessly informed.
So, why should you integrate your business systems with your GIS?
Added spatial awareness
Everything is somewhere, and there are patterns in data that can only be seen when looking at it on a map. Whether you need to view important asset data, or see key business intelligence data for a geographic area, the added element of spatial context takes your data to the next level.
Streamline data access
By centralizing access to important data in your mapping applications, you’re able to break down departmental silos and improve data flow. Many organizations have important data that is difficult to access because it’s managed by another department. By integrating everything into a centralized application, data flows between systems and departments seamlessly.
The City of Brooklyn Park has realized the benefits of an integrated approach. Historically, City staff were limited to the databases and systems in their department. By integrating their assessment, licensing, land management, permitting, and police records into one internal mapping application, City staff can now access the data they need with just a few clicks.
We’ve also seen customers across other industries benefit significantly by integrating 3rd party business systems with their GIS. For example, we have helped oil and gas pipeline operators streamline their asset integrity programs. By integrating all the required data – which lives across multiple systems – into their Geocortex applications, many have been able to reduce overhead, simplify the audit process, and protect themselves from inevitable technology change.
Leave your data where it is
In most cases, you don’t even need to replicate, move or remodel your existing data to integrate it with your GIS. Geocortex Essentials offers varying levels of integration that allow you to leave your data where it lives, work with it directly on the map, and have it update the 3rd party system dynamically. It makes it simple to gain added spatial awareness and improve data flow between departments and systems.
What kinds of systems can you integrate with Geocortex?
The flexibility and extensibility of Geocortex allows you to integrate virtually any 3rd party business system or data source you’re currently using. Geocortex customers have integrated with ERP, asset and land management, document management, business intelligence, and CRM systems.
Whatever your business need is, chances are high that Geocortex can help you solve it. With a wide array of integration methods – from simple hyperlinking to custom API integrations – there is almost no limit to what you can achieve with Geocortex Essentials integrations.
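The "leave your data where it is" pattern is simple to sketch. In this hypothetical example (the permit records and field names are stand-ins for a real business system), the map feature stores only a shared key, and external attributes are merged in on demand rather than copied into the GIS.

```python
# Minimal sketch of key-based integration: the GIS feature carries a
# shared key, and attributes are pulled from the external system at
# display time. The permit data below is entirely made up.

# Key-based lookup simulating a 3rd-party permitting system's records.
permit_system = {
    "PRCL-001": {"permit_status": "Approved", "inspector": "J. Chen"},
    "PRCL-002": {"permit_status": "Pending", "inspector": "A. Singh"},
}

def enrich_feature(feature: dict) -> dict:
    """Merge external permit attributes into a GIS feature on demand."""
    external = permit_system.get(feature["parcel_id"], {})
    return {**feature, **external}

feature = {"parcel_id": "PRCL-002", "address": "100 Main St"}
print(enrich_feature(feature)["permit_status"])  # Pending
```

Because the external system remains the source of truth, the map always reflects the latest records without any replication or remodeling.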
Last week, the Vancouver Police Department (VPD) released their findings from a 6-month program they undertook with us in 2016 to pilot Geocortex Crime Forecasting, a state-of-the-art machine learning system that helps predict when and where crimes may occur.
The goal of VPD’s pilot program was to prevent break-ins and property crime throughout metro Vancouver, which typically spikes during the summer months when many people are away on vacation or leaving doors and windows open to beat the heat.
VPD focused on areas south of Broadway where residential break-ins are the most common. Special Constable Ryan Prox reported that their program reduced property crime by as much as 27% in some areas (compared to the past 4 years).
Geocortex Crime Forecasting turns simple, historic point crime data into powerful, actionable crime predictions. Predictions are quite precise: the system forecasts within a 100 m × 100 m (roughly 300' × 300') block, with a 2-hour time window, to determine where and when a particular type of crime might occur. By reviewing what took place in the field against predictions made for the same day, Vancouver’s Chief of Police said the system predicted with up to 80% accuracy.
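To make that granularity concrete, here is a back-of-the-envelope illustration (not the forecasting model itself, which is far more sophisticated) of how historic point incidents would be bucketed into 100 m × 100 m cells and 2-hour windows before any prediction can be made. Coordinates are made-up values in metres.

```python
# Illustration of the forecasting granularity only: bucket historic
# incidents into 100 m x 100 m spatial cells and 2-hour time windows,
# then count incidents per bucket. All data here is fabricated.

def bin_incident(x: float, y: float, hour: int):
    """Map an incident to its (cell_x, cell_y, time_window) bucket."""
    return (int(x // 100), int(y // 100), hour // 2)

incidents = [(125.0, 480.0, 14), (180.0, 410.0, 15), (950.0, 60.0, 3)]
counts = {}
for x, y, hour in incidents:
    key = bin_incident(x, y, hour)
    counts[key] = counts.get(key, 0) + 1

print(counts)  # the first two incidents share the (1, 4, 7) bucket
```

A bucket with a history of repeated incidents at the same time of day is the kind of pattern a forecasting model can learn from.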
With their successful pilot program complete, VPD has rolled out Geocortex Crime Forecasting to the entire police department: the first in Canada to adopt a “predictive policing” system.
The completion of the program also means that we will continue our investment in Geocortex Crime Forecasting and are excited to share it with other organizations this year!
To learn more about VPD’s program, check out the articles below: