Build custom activities with the new SDK for Geocortex Workflow 5
One thing we’ve been hearing from developers, though, is that they want the ability to apply their own code in the workflows they’re building.
So, what are “activities”? In the simplest terms, they’re the building blocks of a workflow - each activity represents a unit of work. Geocortex Workflow 5 offers more than 150 pre-built activities that chain together to automate almost any task. Activities such as geocode, query layer, set the map extent, get user info, calculate distance, buffer geometry, run geoprocessing, and so many more allow you to streamline even the most complex GIS and business tasks.
Flex your development chops and write activities to perform tasks that weren’t previously possible – or were extremely complex to assemble with pre-built activities. You can combine your programming skills with Geocortex Workflow’s intuitive, activity-based design to build powerful applications.
Custom activities can be built for yourself or for others in your organization; even non-developers can work with them as they would any other activity in Geocortex Workflow Designer. And provided that your technology of choice supports the functionality you’re building, custom activities can be consumed in Geocortex and/or Web AppBuilder for ArcGIS applications.
Take Geocortex Workflow 5 even further
While most tasks can be automated with the pre-built, out-of-the-box (OOTB) activities offered with Geocortex Workflow 5, you can now build anything you want with the SDK. Custom integrations, talking to custom web services, connecting with 3rd party APIs, and interfacing with custom code in your existing apps are now all possible.
Here are a few examples of what you can do with custom activities:
- Perhaps you want to integrate with a 3rd party system like SAP®. While this is possible with pre-built activities, you’ll be manually assembling workflow logic to make the web requests, parse the responses, and execute your business logic. With the latest updates, you can achieve a result that’s cleaner, more efficient, and more consumable by wrapping the logic in a few simple custom activities.
- Many common tasks are time-consuming to build – maybe you find yourself using the same pattern over and over in one workflow. Instead of following this repetitive pattern, you can bundle all the logic within a single custom activity. An example might be sorting map features by multiple columns. Pre-built activities are available that will sort data by one column, but it’s more efficient to write a custom activity to sort by multiple columns than it is to link activity after activity – especially if you need to perform these tasks across multiple applications and workflows.
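As a sketch of what the core logic of such an activity might look like, the multi-column sort below takes a list of attribute records and an ordered list of sort specifications. The function and type names here are hypothetical; in a real custom activity this logic would live inside the activity’s execute method.

```typescript
// Hypothetical core of a "sort by multiple columns" custom activity.
// Features are plain attribute records; specs is an ordered list of
// sort instructions, applied left to right.
type Feature = Record<string, string | number>;

interface SortSpec {
    field: string;
    descending?: boolean;
}

function compareValues(x: string | number, y: string | number): number {
    if (typeof x === "number" && typeof y === "number") return x - y;
    return String(x).localeCompare(String(y));
}

function sortByColumns(features: Feature[], specs: SortSpec[]): Feature[] {
    // Copy first so the caller's array is left untouched.
    return [...features].sort((a, b) => {
        for (const { field, descending } of specs) {
            const cmp = compareValues(a[field], b[field]);
            if (cmp !== 0) return descending ? -cmp : cmp;
        }
        return 0;
    });
}
```

Packaged once as an activity, this sort can be dropped into any workflow instead of chaining single-column sort activities together.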
Set a standard
Unless your organization follows strict guidelines for building custom apps and widgets, there is always the risk that developers will use different patterns and approaches to develop custom code. This makes it difficult for others to maintain or update the code; it can be a bit like the wild west.
This can be mitigated with Geocortex Workflow 5’s custom activities. All activities have the same, simple signature of inputs, outputs, and an execute method. Following the activity-based pattern ensures you have a standard practice for building custom logic.
With activities, you are implementing a unit of work rather than a large, rigid solution. This promotes reusability and your code will be easier to write, interpret, test, and maintain. Any developer will be able to pick up your custom activities and understand how to work with them.
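To make the pattern concrete, here is a minimal sketch of the activity shape: typed inputs, typed outputs, and a single execute method. The class and interface names are illustrative only; the actual base types and registration steps come from the SDK.

```typescript
// Illustrative only: the input/output interfaces and the execute
// method mirror the activity pattern, but the real SDK supplies its
// own base types and registration hooks.
interface BufferDistanceInputs {
    distanceMeters: number;
    safetyFactor: number;
}

interface BufferDistanceOutputs {
    adjustedDistance: number;
}

class CalculateBufferDistance {
    // A single, self-contained unit of work.
    execute(inputs: BufferDistanceInputs): BufferDistanceOutputs {
        return { adjustedDistance: inputs.distanceMeters * inputs.safetyFactor };
    }
}
```

Because every activity shares this shape, a developer can open the class, read the input and output interfaces, and know immediately how to wire it into a workflow.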
You can also control how custom activities are presented to other users in the browser-based Geocortex Workflow Designer. They can be configured to look like the existing OOTB activities, helping ensure a consistent pattern across your apps.
Custom activities in Web AppBuilder for ArcGIS®
At Latitude Geographics, we’ve always built complementary technology to help our customers accomplish even more with Esri’s ArcGIS platform. With Geocortex Workflow 5, we’ve taken this to a new level by allowing you to build workflows that run inside Web AppBuilder for ArcGIS.
If you’re using Web AppBuilder for ArcGIS, creating custom activities with Geocortex Workflow 5 is still the preferred alternative to writing a bunch of custom widgets. Initial deployment will require a similar amount of effort, but ongoing maintenance and modifications of custom activities require significantly less time (and pain!).
If you write a custom widget for Web AppBuilder for ArcGIS and want to deploy it to multiple apps, you need to edit the source code in all the applications using that widget each time a modification is required. With Geocortex Workflow 5, the custom code is packaged in an activity, and you only need to modify the source activity for changes to be applied across all your applications.
Learn more about deploying workflows inside Web AppBuilder for ArcGIS in the Geocortex Workflow Discovery Center.
Start building today
You can access the SDK in our Documentation Center. Just look for the .zip file that contains all the necessary instructions you need to get started.
Let us know how it goes
As you get going with the new SDK, we want to hear your feedback. If you have questions, comments, or concerns, please get in touch with us to let us know.
We’d also love it if you share what you’re building with us and other users in the Geocortex Workflow Community. This is a great place to connect with other users - everyone benefits from sharing tips, tricks, and sample workflows.
What's new in Geocortex Essentials 4.9?
Our Product Development team has been hard at work for the past few months, adding functionality that lets GIS professionals work more intuitively with data in tables (after all, your data is the most important aspect of your apps) and bringing in editing functionality that has typically been reserved for desktop GIS tools.
One of these features is a redesigned results table that dramatically improves performance, introduces infinite scrolling for hundreds (or even thousands) of records, and allows you to perform actions on specific results or sets of results. Geocortex Essentials users also benefit from advanced editing tools that allow you to cut and union geometries while maintaining attribute integrity.
Geocortex Essentials 4.9 (alongside Geocortex Viewer for HTML5 2.10 and Geocortex Mobile Framework 2.3.1) is now available for download. You can learn more about the latest releases on the product release page, or download the new versions in the Geocortex Support Center.
Additional support for Geocortex Workflow 5
We’ve continued to expand support for Geocortex Workflow 5, allowing you to deploy offline workflows, dynamic forms, and new activities in Geocortex Essentials applications.
Geocortex Workflow 5 is our newest product that allows you to extend Geocortex and Web AppBuilder for ArcGIS® applications by turning complex business processes into simple, guided end-user interactions. Instead of writing thousands of lines of code to meet custom requirements, choose from more than 150 pre-built activities that chain together to automate almost any task.
Building upon Geocortex Essentials’ popular workflow capabilities, Geocortex Workflow 5 is also offered as a standalone product and enables you to build workflows right in your browser.
Take Geocortex Workflow for a spin in the Discovery Center, complete with videos, tutorials, and sample apps.
Take workflows into the field
Geocortex Workflow 5 provides the ability to take workflows offline to support field operations. You can download workflows and run them in your mobile viewers, keeping field workers productive… even in areas with no network connectivity.
For example, a municipal field engineer may need to perform fire hydrant inspections throughout a rural neighborhood. With Geocortex Workflow 5, the inspection application and corresponding workflow can be downloaded right to the worker’s tablet at the beginning of the day, and the assigned inspections can be completed without any network connection.
When the day is done and the tablet has returned to network connectivity, the field worker can sync the inspection data back to the database. This ensures that the field worker can continue to complete inspections, regardless of where they are, and that the data is readily available to the people in the office who need it.
Explore Geocortex Workflow 5
If you’d like to learn how you can take your business processes offline, or extend applications you’ve built with Web AppBuilder for ArcGIS, you can get a feel for the product in the Geocortex Workflow 5 Discovery Center. Here you’ll find demonstrations of different deployment scenarios, tutorials, and sample applications.
Or, if you’re ready to get started with Geocortex Workflow 5 you can sign up for a free developer license here.
ArcGIS Pipeline Referencing: Choose the best data model
Over the past few weeks, I’ve shared foundational knowledge about how data is stored and managed in the pipeline industry. My first post introduced ArcGIS Pipeline Referencing (APR) and explained some options operators have in adopting it. My second post worked to demystify the confusing world of pipeline data models (there’s a lot to consider).
In this post, I will outline important information you need to consider when choosing a data model for your organization, including:
- Limitations of current data models;
- How APR is addressing these limitations; and
- Questions you should ask yourself to help assess the best data model for your organization (should you choose to move to APR).
Limitations of Existing Models
A data model is defined as: “An abstract model that organizes elements of data, and standardizes how they relate to one another and to properties of real world entities.”
In the pipeline space, real world entities include not only the pipe and valves that make up the pipeline system, but all of the things that happen to and around the pipe. Things such as repairs & replacements, surveys & patrols, one-call, cathodic protection, ILI & CIS assessments, HCA & class location, land ownership & rights-of-way, and crossing information all have components that, in one form or another, need to be captured in your GIS.
The differing needs of these complex data representations expose limitations in legacy systems. And in a world where critical business decisions must be made from this data, identifying limitations and addressing them is an important step as we move to next-generation solutions.
Limitation #1: Data volume
As the years have progressed, the operational and regulatory needs surrounding pipelines have increased. These needs are driving new levels of inspections and analyses on pipeline systems - resulting in more data, both in terms of volume and complexity. The legacy systems were simply not designed to handle the volume of data current programs produce.
An example is the case of Inline Inspection (ILI) and Close Interval Survey (CIS) data. A single ILI or CIS inspection results in hundreds of thousands of records. With assessment intervals recurring every 1-7 years -- and operators performing dozens of inspections each year -- the resulting records from these inspections alone add millions of records to the database. This doesn’t include follow-up inspections, digs, and run comparison activities.
When you couple the sheer volume of records with complexities surrounding data management and the need to provide a high-performance environment, limitations in the system are quickly exposed. These limitations force operators to make difficult data storage decisions, often choosing to remove subsets of data from the system of record. This is sub-optimal to say the least; it significantly impacts your ability to view and analyze important trends in the data.
Limitation #2: Engineering Stationing
Engineering stationing is important, complex, and rooted in pipeline data management. Before modern GIS, operators kept track of the location of pipelines and associated appurtenances using engineering stationing on paper or mylar sheets. With the vast majority of pipelines in use today constructed before modern GIS technology existed, large volumes of data were referenced with this approach.
Engineering stationing doesn’t benefit all operators; however, companies that manage gathering and distribution assets find this method burdensome … and dare I say unnecessary?
When traditional data models were developed, the need to adhere to legacy engineering stationing outweighed the need to redesign the entire system to favor a spatial-first approach. But as technology has improved, and more users have embraced spatial data, new methods to blend modern (spatial-first) and legacy (stationing-first) models have emerged. Operators need this flexibility when managing their assets.
Limitation #3: Native support for the Esri Platform
The emergence of the Pipeline Open Data Standard (PODS) represents the last major shift in data storage solutions for pipelines, and it happened nearly 20 years ago. At that time, the GIS landscape was both immature and fragmented. As a by-product, PODS was designed specifically to be GIS-agnostic. In the nearly two decades since, Esri has emerged as the predominant provider of spatial data management, and they have developed a suite of solutions that enable stronger collection, management, and analysis of data.
Chances are your organization embraces Esri for spatial data management and content dissemination, which raises the question: “If your organization has standardized on Esri technology, does it make sense to employ a data structure that does not natively support the environment?” (Hint: probably not.)
Addressing and Improving Limitations
The core of APR has been engineered to overcome key limitations in the existing designs of PODS and APDM, directly addressing the three described above.
Improvement #1: Data volume
APR has been engineered to handle high volumes of data more efficiently, with a focus on scalability: it supports large datasets, time-aware data, and the ability to offload data storage to other systems. To achieve this, configurable rules allow a more fine-grained approach to managing data during routine line maintenance. Implementations are no longer limited to a binary choice between keeping data referenced to the LRS or detaching it.
Changes like these allow operators to keep more data in the system, providing a baseline for more powerful analysis and decision making.
Improvement #2: Engineering Stationing
As explained above, engineering stationing is firmly rooted in pipeline data management, but it’s not required for all operators. New construction, gathering systems, and vertically-integrated distribution companies are finding the rigorous application of stationing to be unnecessary overhead. If your current database repository requires it, and your organization doesn’t rely on it, you are taking on unnecessary data management cycles - costing valuable time and money.
APR not only provides the ability to manage data using stationed and non-stationed methods; its flexibility allows both stationed and non-stationed lines to exist in the same model. Let that sink in for a bit: Operators that have deployed two separate data management systems can now consolidate the management of these assets! This functionality benefits a majority of the clients I’ve worked with over the years.
Improvement #3: Native support for the Esri Platform
As I stated in my previous post, APR is (possibly most importantly) integrated with the ArcGIS platform. You can perform complex long transactions on your data, analyze it in ways that have not been possible before, keep track of product flow using the Facility Network, and get the data in the hands of your organization with methods that are integrated, fluid, and connected.
Considerations for Implementation
If you’re considering implementing ArcGIS Pipeline Referencing (APR), knowing why, and which data model to use with it, has more to do with your business than with IT -- success can be achieved with either model.
But how do you decide which one is best for your organization? Here are some questions to consider as you’re laying the foundation for your next-generation GIS implementation.
1) Business focus: What segment of the vertical are you in?
If you are a distribution company with transmission assets, the decision is pretty clear: you should choose Utility and Pipeline Data Model (UPDM). It’s designed as a distribution-first model, allowing you to integrate the management of distribution and transmission assets in a single model.
If your company is ‘upstream’ of distribution, the answer gets a bit trickier. Both models are adequate, but my vote tends to lean towards PODS for a few reasons:
- Out of the box, PODS supports APR more natively than UPDM for operators without distribution assets.
- Are you a liquids operator? As UPDM is focused on gas utility and transmission, the PODS model will provide a better solution for those moving liquid products.
- PODS is a thriving community of operators and vendors working together to design a comprehensive model for the industry. This collection of subject matter expertise is invaluable to operators – and provides an opportunity to share your experience with like-minded individuals.
2) Existing model: What are you using now?
As you consider moving to APR, understand that it’s going to require a data migration. The existing system will need to be mapped and loaded into the new solution. If you are currently using PODS and are a gathering, midstream, or transmission company, APR with PODS is probably the best solution to implement. It’s likely that your existing data will migrate more seamlessly, and the model will make more sense to those that manage and interact with the data.
If your organization is primarily gas distribution, and you’ve implemented a PODS model for a small subset of high-pressure assets in the system you manage, consider UPDM. You can take advantage of the intended benefits and consolidate those assets into a common platform.
3) Other programs: ILI, CIS, other survey, Cathodic Protection
If your company has a strong investment in recurring inspections, PODS again rises as the preferred model, especially considering the efforts of the PODS Next Generation initiative around how to efficiently store and process this data moving forward.
With the growing importance of importing and exporting data (due to acquisitions, divestitures, etc.), analysis, and reporting, a system that promotes standard mechanisms to exchange data becomes increasingly important. With the work the PODS organization is putting into a data interchange standard, PODS again rises as the preferred model.
There isn’t just one approach, but there is a best approach for your organization
While this change is beneficial for operators, many things need to be considered before you commit to an approach. I hope my series of posts provides some clarity for you. To stay up-to-date on the data model landscape and the tools surrounding it, I encourage you to follow the PODS association and Esri. The work of these two organizations in the pipeline space is a great thing for our industry.
If you’d like to discuss any of these concepts further, or would like to have a conversation about which model is best for your implementation, please get in touch with me here. I, and the rest of the Energy team at Latitude, are eager to offer our years of expertise to help you.
Demystifying Pipeline Data Models
If you’re reading this, you’re likely involved with managing GIS pipeline data. If you aren’t, I recommend checking out an informative, relevant (and quite possibly more interesting) read on enhancing your GIS with integrations or connecting business processes to your GIS. If you’re still with me, strap in.
Why do we talk about data models so much in the pipeline space? From my years interacting with operators, vendors, and regulators, I believe it comes down to a handful of reasons:
- The regulatory landscape;
- The need for thorough and accurate data;
- Meeting demands of complex implementation architectures;
- Maintaining interoperability; and
- Aligning with industry standards for linear referencing.
The pipeline data model landscape can be a difficult one to navigate. Operators work with significant amounts of data and there is no shortage of models to explore.
In my last post, I introduced ArcGIS Pipeline Referencing (APR). APR is at the core of an integrated offering from Esri that incorporates a linearly referenced GIS (LRS) into the ArcGIS® platform. APR focuses on the “core” of the LRS, with the data on the line (events) modeled in either the Pipeline Open Data Standard (PODS) Next Generation model or the Utility and Pipeline Data Model (UPDM).
This is a slight deviation from previous approaches, which has introduced some confusion. PODS and UPDM provide database models to organize your pipeline data, while APR provides a set of tools to manage and interact with it inside your GIS. Hopefully the diagram below helps explain it a bit.
Legacy data models and their impact
With the emergence of Integrated Spatial Analysis Techniques (ISAT), pipeline operators have had methods to store data about their systems and the surrounding environment since at least the early 1990s (some methods probably pre-date that). These methods have continued to develop through the work of the PODS organization and contributors to the ArcGIS Pipeline Data Model (APDM).
Each of these have offered unique benefits to the industry, but they’ve also introduced unneeded fragmentation to the landscape. As I mentioned in my previous post, APR helps simplify this by providing consolidation.
Pipeline Open Data Standard (PODS)
PODS is the data model standard for the pipeline industry. Founded in 1998, PODS was developed to extend legacy models (ISAT), and provide a baseline for software solutions in the industry. Since that time, PODS has established itself as the industry standard.
The success of PODS is rooted in the unique nature of operators and vendors coming together to meet the storage, analysis, regulatory, and reporting needs of the industry. It is important to note that PODS is more than a data model: it’s a group of individuals coming together to discuss, evaluate, and establish the data needs for the industry. The work of the PODS organization extends well beyond how to model a pipeline in a database.
PODS exists predominately in two variations: PODS Relational and PODS Spatial. Both models share a structure and format that adheres to the PODS standards, but differ in how they’re implemented. PODS Relational leverages core relational database standards, and PODS Spatial provides a native implementation for Esri Geodatabases.
Important to note: even though PODS Relational is designed as a GIS-agnostic data model (i.e. it’s not a geodatabase), almost every implementation I have worked with has vendor-developed implementation methods and toolsets that integrate with Esri’s ArcGIS platform.
ArcGIS Pipeline Data Model (APDM)
APDM is the Esri pipeline data model template for pipeline assets. As with all of the models provided by Esri, APDM is a method for operators to access a structured data model, free of charge, in full support of an Esri implementation.
The template nature of APDM differs from the standards designation of PODS by allowing any portion of the base template to be altered to meet implementation requirements. This flexibility is the single biggest deviation from the standards-driven approach of PODS. Another separation between APDM and PODS is that APDM focuses on the features that make up the pipeline network itself, and is not intended to be as encompassing as the PODS models.
With the release of the UPDM, APDM has ultimately been retired.
Utility and Pipeline Data Model (UPDM)
As mentioned, UPDM has essentially replaced APDM as the recommended Esri pipeline data model. This model provides an implementation foundation for gas and hazardous liquids industries. UPDM has been designed to work with or without the APR linear referencing component of the ArcGIS platform.
Most importantly, UPDM is the first model released that allows vertically-integrated utilities (gas distribution companies that operate regulated, high-pressure lines) to consolidate database schemas, and centralize data management to a single model.
I hope this post has helped demystify the world of pipeline data models, as there is a lot to consider and it can be difficult to understand.
Next week, I will dive into what you should consider when choosing the best pipeline data model for your operation, including the limitations of different models, how APR is addressing the limitations, and the questions you should be asking yourself.
See the United Nations present at the 2017 Geocortex User Conference
The United Nations Department of Safety and Security (UNDSS) provides safety and security services to 120,000 UN personnel in 117 countries, many of whom work in challenging environments with increased risk to personal safety. Protecting UN personnel and enabling them to do their work is at the heart of what the UNDSS does.
To be successful, they must be coordinated, informed, and proactive. They maintain safety by analyzing threats, coordinating security in the field, supporting peacekeeping, collaborating with NGOs, and providing security at major events.
Andre Dehondt and Hwa Saup Lee of the UNDSS Crisis Management Information Support Section will present how their GIS helps them support this mission. Their systems provide critical information to decision makers that allows them to fulfill their duty of care obligations. Geocortex is currently used as the foundation for their Threat and Risk Assessment and Mobile Travel Advisory applications.
This is a presentation you won’t want to miss! The UNDSS does extremely important work, and they are a great example of how a GIS can help save lives and protect critical infrastructure.
We’re less than a month away from the Geocortex User Conference
This will be our first in-person conference in more than 10 years, and it’s sure to be a fun, informative, and inspiring few days! The conference takes place October 24 & 25 in Washington, DC. You can learn more about the 2017 Geocortex User Conference and register using the button below.
What is ArcGIS Pipeline Referencing and why should you be paying attention to it?
ArcGIS Pipeline Referencing (APR) is an extension to Esri’s ArcGIS platform that provides an event-based GIS, with native support for linear referencing systems (LRS), through standard Esri tools. APR allows pipeline operators to standardize data management and take advantage of established patterns for connecting this data with other business systems.
APR addresses many challenges pipeline operators have faced over the years. From consolidating multiple editing environments to providing a single solution that caters to gathering, midstream, transmission, and distribution assets, APR is enabling companies to get more from their investment in spatial data. Coupling this extension with the ArcGIS platform provides a level of integration and visibility previously unrealized in this space.
Why does it exist?
Linearly referenced GIS exists across multiple industries, including roads and highways, pipelines, rail, and water/wastewater. While there are differences in how LRS is implemented within these industries, the core principles are the same: provide linear features representing the location of the asset on the Earth (the centerline); provide a standard method to reference features along the line (the linear referencing); and locate appurtenances to the line (the events).
Through the years, proprietary implementations of LRS systems have been built across many verticals. This has resulted in a complex and fragmented approach to managing linearly referenced data in a GIS. APR provides consolidation to this landscape, and lays the foundation for next-generation solutions.
What are the benefits to operators?
Operators stand to be the biggest beneficiaries of Esri providing a native pipeline referencing solution.
Standardization: With the history of divergent data model implementations in the pipeline space, operators have been limited to software that supports their underlying LRS. APR is disrupting the status quo and establishing a new baseline, ultimately allowing you to select the most effective solution for your needs without being restricted by your data model implementation.
Data Governance: The tight coupling of GIS and linear referencing benefits those relying on high-quality, accurate data. With the rise in GIS, and Esri driving the adoption, many GIS technicians and analysts are already knowledgeable in how to manage spatial data with Esri technology. Standardizing on APR provides operators with access to a broader range of technical staff that are capable of managing LRS data.
Consolidation: With APR, multiple linearly referenced systems can be supported in the same model. Many operators manage assets across a large supply chain. Long-haul transmission lines have established patterns relying on engineering stationing, while those overseeing gathering and distribution assets tend to forgo true engineering stationing in favor of routes and measures. Before APR, this meant implementing multiple data models, bringing with it varied methods for managing the data. Now you can consolidate: one system, one model, and one set of editors can manage all assets with the same tools.
What is UPDM (Utility and Pipeline Data Model) and how does it relate to APR?
UPDM is an accompanying data model to APR with a focus on unifying the implementation of vertically-focused operators. While UPDM provides an effective data repository, it is not the only data model compatible with APR. For example, the PODS Association is building the Next Generation model in compliance with APR.
The question then becomes, what data model should you choose for your implementation? I will be following up this post with a deep dive into the many data models that are available, and hopefully I can help you make better sense of these options.
Why are we engaged in this conversation?
Latitude Geographics provides solutions to gathering, midstream, transmission, and distribution clients around the world. In the past year, we’ve seen a sharp rise in the number of operators implementing and embracing APR. Geocortex has been 100% compatible with every implementation we’ve worked on, providing immediate value for clients leveraging PODS and UPDM.
With the release of powerful linear referenced toolsets, Geocortex can further your success with APR. Please reach out to firstname.lastname@example.org for more information.
Integrate key business systems to enhance your GIS
One of the most powerful capabilities of a GIS is the ability to integrate your mapping applications with 3rd party business systems and data sources. Integrating with key business systems extends the reach and capabilities of your applications, and ensures you and your team remain efficient and effortlessly informed.
So, why should you integrate your business systems with your GIS?
Added spatial awareness
Everything is somewhere, and there are patterns in data that can only be seen when looking at it on a map. Whether you need to view important asset data, or see key business intelligence data for a geographic area, the added element of spatial context takes your data to the next level.
Streamline data access
By centralizing access to important data in your mapping applications, you’re able to break down departmental silos and improve data flow. Many organizations have important data that is difficult to access because it’s managed by another department. By integrating everything into a centralized application, data flows between systems and departments seamlessly.
The City of Brooklyn Park has realized the benefits of an integrated approach. Historically, City staff were limited to the databases and systems in their department. By integrating their assessment, licensing, land management, permitting, and police records into one internal mapping application, City staff can now access the data they need with just a few clicks.
We’ve also seen customers across other industries benefit significantly by integrating 3rd party business systems with their GIS. For example, we have helped oil and gas pipeline operators streamline their asset integrity programs. By integrating all the required data – which lives across multiple systems – into their Geocortex applications, many have been able to reduce overhead, simplify the audit process, and protect themselves from inevitable technology change.
Leave your data where it is
In most cases, you don’t even need to replicate, move, or remodel your existing data to integrate it with your GIS. Geocortex Essentials offers varying levels of integration that allow you to leave your data where it lives, work with it directly on the map, and have it update the 3rd party system dynamically. This makes it simple to gain added spatial awareness and improve data flow between departments and systems.
What kinds of systems can you integrate with Geocortex?
The flexibility and extensibility of Geocortex allows you to integrate virtually any 3rd party business system or data source you’re currently using. Geocortex customers have integrated with ERP, asset and land management, document management, business intelligence, and CRM systems.
Whatever your business need is, chances are high that Geocortex can help you solve it. With a wide array of integration methods – from simple hyperlinking to custom API integrations – there is almost no limit to what you can achieve with Geocortex Essentials integrations.
Get in touch with us to learn how Geocortex can integrate with the system you’re using.
Vancouver Police Department and Geocortex: using machine learning to prevent crimes before they happen
Last week, the Vancouver Police Department (VPD) released their findings from a 6-month program they undertook with us in 2016 to pilot Geocortex Crime Forecasting, a state-of-the-art machine learning system that helps predict when and where crimes may occur.
The goal of VPD’s pilot program was to prevent break-ins and property crime throughout metro Vancouver, which typically spikes during the summer months when many people are away on vacation or leaving doors and windows open to beat the heat.
VPD focused on areas south of Broadway where residential break-ins are the most common. Special Constable Ryan Prox reported that their program reduced property crime by as much as 27% in some areas (compared to the past 4 years).
Geocortex Crime Forecasting turns simple, historic point crime data into powerful, actionable crime predictions. Predictions are quite precise: the system forecasts within a 100 m × 100 m (roughly 300’ × 300’) block and a 2-hour time window to determine where and when a particular type of crime might occur. Vancouver’s Chief of Police said that when field outcomes were reviewed against predictions made for the same day, the system predicted with up to 80% accuracy.
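To make the granularity concrete, here is a minimal sketch of how historic point crime data could be aggregated into the 100 m × 100 m cells and 2-hour windows the article describes. This is an illustration only, not the Geocortex Crime Forecasting implementation; the coordinates are assumed to be in a projected, metre-based system (e.g. UTM), and the event data is invented.

```python
from collections import Counter
from datetime import datetime

CELL_SIZE_M = 100   # 100 m x 100 m spatial cells
WINDOW_HOURS = 2    # 2-hour time windows (12 per day)

def space_time_bin(x, y, timestamp):
    """Map a point event to a (cell_x, cell_y, date, window) key."""
    cell_x = int(x // CELL_SIZE_M)
    cell_y = int(y // CELL_SIZE_M)
    window = timestamp.hour // WINDOW_HOURS
    return (cell_x, cell_y, timestamp.date(), window)

def bin_events(events):
    """Count events per space-time cell; counts like these are the
    raw input a forecasting model would learn from."""
    return Counter(space_time_bin(x, y, t) for x, y, t in events)

# Hypothetical break-in events: (easting_m, northing_m, timestamp)
events = [
    (491530.0, 5458220.0, datetime(2016, 7, 4, 14, 10)),
    (491560.0, 5458290.0, datetime(2016, 7, 4, 15, 45)),
    (492400.0, 5458900.0, datetime(2016, 7, 4, 9, 5)),
]
counts = bin_events(events)
```

The first two events land in the same 100 m cell and the same afternoon window, so they contribute a count of 2 to that cell; a model trained on such counts can then flag high-risk cells and windows for patrol planning.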
With their successful pilot program complete, VPD has rolled out Geocortex Crime Forecasting to the entire police department: the first in Canada to adopt a “predictive policing” system.
The completion of the program also means that we will continue our investment in Geocortex Crime Forecasting and are excited to share it with other organizations this year!
To learn more about VPD’s program, check out the articles below:
If you want to learn more about Geocortex Crime Forecasting, we’d be happy to chat! You can contact us here.
Geocortex at the 2017 Esri User Conference
The Esri User Conference is right around the corner, taking place July 10-14 in beautiful San Diego, California. The Esri UC is our biggest event each year, and this is a particularly special year for Latitude Geographics, as we’ve launched one of our most noteworthy products in years – perhaps in the history of our company: Geocortex Workflow.
Geocortex Workflow 5 builds upon our extremely popular Geocortex Essentials workflow capabilities, and is offered as a standalone product. Geocortex Workflow 5 provides many new benefits, including: an overhauled design experience that makes building and maintaining workflows even more intuitive; the ability to run workflows in an offline environment (if using Geocortex viewers); and richer, more dynamic forms.
And for the first time in the history of Geocortex, you don’t just use our technology alongside Esri; you can leverage the power of Geocortex inside your Esri apps. Workflows authored with Geocortex Workflow 5 can run inside Esri’s Web AppBuilder for ArcGIS as a custom widget that you can acquire via the ArcGIS Marketplace.
Be among the first to explore and try out our next-generation workflow technology at one of our demo stations; we think you’ll be blown away. We’ll also be hosting a related fun activity at our booth (301) for the creative builder in everyone.
There will be many opportunities to see Geocortex Workflow – along with other exciting product enhancements – in action during your time in San Diego:
- We’ll be in booth #301 all week! Team members from Sales, Partners, Support, Products and more will be available to chat with new and existing Geocortex customers. We will have dedicated pods for you to explore different Geocortex capabilities.
- Our annual Geocortex Technology Update will be taking place on Tuesday, July 11 from 10:30-11:30 AM in room 12 of the San Diego Convention Center. Join us to learn about key product developments and our product roadmap, and to discuss our strategy with Esri’s modern technology.
- We have two dedicated Geocortex demos that will be happening in our booth. These fast-paced, hands-on demonstrations highlight the latest developments in Geocortex technology. Join us at booth #301 on Tuesday, July 11 or Wednesday, July 12 at 5:00 PM.
See you in San Diego!
When is it time to break up with a business process?
Far too often, organizations find themselves tied to inefficient, labor-intensive processes. Whether your end-users are stuck completing mindless and repetitive tasks, or they’re collecting data on paper (only to have to enter it in an archaic system in the office), it can make for a frustrating experience.
With advancements in technology, and GIS in particular, there is no reason to continue following manual, inefficient processes. There are several signs that a business process just isn’t right for you anymore – see if any of them sound familiar.
End-users are experiencing difficulty
While this may be the most obvious sign that a process isn’t working, it’s often overlooked. When processes are complex or manual in nature, you’ll generally see end-users having difficulty completing their tasks and continually requiring assistance from the GIS department.
This can slow down your workforce and bog you down with requests for assistance. One way to improve in this area is by simplifying the end-user experience; providing users with guided interactions makes for a much more pleasant experience and improves end-user success.
You’re experiencing data integrity issues
Data integrity is an ongoing challenge in many organizations, and depending on the type of data being used, errors can have significant consequences. In many cases, poor data comes from using paper-based processes that are prone to transcription errors, or from not setting the proper parameters on the data being entered into an application.
At its core, GIS is about improving decision making. If your team doesn’t have proper information, they are not making properly informed decisions.
The best way to avoid this is to provide dedicated interfaces for data collection. Most technology will allow you to present only the data you require your field workers to collect, as well as set rules against the data being entered, to ensure it enters the system correctly. It’s unlikely that you’ll be able to completely avoid data errors, but following best practices can reduce them significantly.
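The idea of setting rules against the data being entered can be sketched in a few lines. This is a hypothetical illustration, not a Geocortex API: the field names (`asset_id`, `condition`, `depth_cm`) and the rules themselves are invented, but the pattern of validating each record before it enters the system is the one the paragraph describes.

```python
# Allowed values for a hypothetical "condition" field on an inspection form.
VALID_CONDITIONS = {"good", "fair", "poor"}

def validate_inspection(record):
    """Check a field-collected record against entry rules.
    Returns a list of errors; an empty list means the record is clean."""
    errors = []
    if not record.get("asset_id"):
        errors.append("asset_id is required")
    if record.get("condition") not in VALID_CONDITIONS:
        errors.append(f"condition must be one of {sorted(VALID_CONDITIONS)}")
    depth = record.get("depth_cm")
    if depth is not None and not (0 <= depth <= 500):
        errors.append("depth_cm must be between 0 and 500")
    return errors

clean = {"asset_id": "A-102", "condition": "fair", "depth_cm": 42}
dirty = {"condition": "terrible", "depth_cm": -3}
```

Rejecting `dirty` at collection time, with a clear message to the field worker, is far cheaper than cleaning up a bad record after it has propagated into downstream systems.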
Your users are spending a lot of time searching for information across disparate systems
Most organizations house important business data across several different systems; there are financial systems, asset management systems, document management systems, business intelligence systems, and many more.
If these systems are not properly integrated, it can become difficult for your end-users to track down all the information they need. It can also waste time and money if your users need to jump between systems to find information.
This pain can be alleviated with proper integrations between systems. Try making all the necessary data – regardless of which system it lives in – available in the application being used to manage a process.
For example, if your end-users need to complete fire inspections on buildings, don’t make them jump to a separate zoning system to collect information about a particular building. Instead, consider integrating the zoning system within the inspection application, and automatically present the necessary zoning info at a stage in the process that makes sense.
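The fire-inspection example above can be sketched as follows. This is a hypothetical illustration: in practice the zoning lookup would call the zoning system's web service, but here it is stubbed with a dictionary, and the parcel IDs and zoning codes are invented.

```python
# Stub for the zoning system; a real integration would query its API.
ZONING_BY_PARCEL = {
    "013-245-678": {"zone": "RM-4", "description": "Multiple Dwelling"},
}

def start_inspection(parcel_id):
    """Assemble the inspection context, pulling zoning automatically
    so the inspector never has to leave the application."""
    zoning = ZONING_BY_PARCEL.get(parcel_id, {"zone": "unknown"})
    return {"parcel_id": parcel_id, "zoning": zoning, "checklist": []}

ctx = start_inspection("013-245-678")
```

The key design choice is that the integration happens at the step where the information is needed: the inspector opens an inspection and the zoning details are already there, rather than being fetched manually from a second system.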
You’re struggling to organize and present important information to other departments and stakeholders
One of the most common, and valuable, use-cases of a GIS is conveying information to people so they can make informed decisions. If you find that your users are spending many hours collecting and compiling information into reports, it may be time to revisit the process.
Besides the obvious repetitive, inefficient nature of manually compiling reports, it takes your users away from high-value activities. Labor is expensive and you want to ensure you’re getting the most out of the investments in your team… they’ll be happier doing more fulfilling work, too!
It all comes back to decision making – if people don’t have the information they need, when they need it, and in an easy-to-interpret format, they won’t be making properly informed decisions.
When building out processes, it’s important to start with the end in mind. Consider GIS technology that allows you to auto-generate reports in different templates. Start with a vision for the report you want, then tailor data collection activities based on your goals.
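"Starting with the end in mind" can be made concrete with a template-first sketch: define the report template, derive the required fields from it, and tailor data collection to match. This is a hedged illustration using Python's standard `string.Template`, not any particular reporting product; the template text and field names are invented.

```python
from string import Template

# Define the desired report first; data collection follows from it.
REPORT_TEMPLATE = Template(
    "Inspection Report\n"
    "Asset: $asset_id\n"
    "Condition: $condition\n"
    "Inspector: $inspector\n"
)

# The fields the template needs, i.e. what field workers must collect.
REQUIRED_FIELDS = ["asset_id", "condition", "inspector"]

def render_report(record):
    """Fill the template, failing loudly if collection missed a field."""
    missing = [f for f in REQUIRED_FIELDS if f not in record]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return REPORT_TEMPLATE.substitute(record)

report = render_report(
    {"asset_id": "A-102", "condition": "fair", "inspector": "J. Smith"}
)
```

Because the report drives the field list, there is no end-of-process scramble to compile information: every completed collection record can be rendered into the final report automatically.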
So, it’s time to break up - what do you do now?
If your organization is experiencing any of the challenges described above, chances are it’s time to break up with your business process(es). But that’s a big decision to make, so what do you do next?
A good start is to shadow the people responsible for completing a particular process (if it’s not you). If you’re in the office supporting a process for a different department, you may not get a full sense of everything that’s involved. It’s good to get in the trenches with your staff and really get an understanding of what it is they have to do to complete their work.
You can also try drawing your workflow out on paper, from end to end. By thinking critically about the entire flow of a process you’re able to identify areas that are ripe for improvement. We often get so tied up in the day-to-day demands of our jobs that we miss tasks that are repetitive or unnecessary. There may be steps that can be automated with the right technology, or removed entirely. Until you see the full picture, it’s difficult to determine where to adjust.