Welcome back, spatial nerds!
Today we are excited to announce a new feature: reading data into Location Intelligence from Snowflake!
We are sure most of you will want to get the most out of this feature as soon as possible. 😊 Before we dive in, there are a few things you should know about spatial data in the cloud, if you don’t know them already.
1. Snowflake is a spatially enabled cloud database, which means it can store GEOMETRY / GEOGRAPHY columns (Geospatial Data Types - Snowflake). This is basically the ability to store our points, lines, and areas in a column within our database.
a. Geometry – a data type that represents the earth on a flat, 2D plane.
b. Geography – a data type that represents a spherical earth using the familiar WGS84 lat/long coordinates.
2. Once your data is in the database, you can utilise the database to apply spatial functions to it (Geospatial Functions - Snowflake). Be aware that some functions only work on one of the two data types, so check which type your data is stored in. The sketch after this list shows both ideas in action.
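To make those two points concrete, here is a minimal sketch in Snowflake SQL. The table and column names are made up for illustration; only the data types and functions come from Snowflake itself:

```sql
-- Hypothetical demo table holding both spatial types
CREATE OR REPLACE TABLE demo_spatial (
    id   INT,
    geom GEOMETRY,   -- planar coordinates on a flat, 2D plane
    geog GEOGRAPHY   -- spherical coordinates, WGS84 lat/long
);

-- ST_DISTANCE on GEOGRAPHY values returns the distance in metres on the sphere
SELECT ST_DISTANCE(
    TO_GEOGRAPHY('POINT(-0.1276 51.5072)'),  -- London
    TO_GEOGRAPHY('POINT(-2.5879 51.4545)')   -- Bristol
) AS distance_in_metres;
```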
Now that you know this is possible, you may want to push some spatial data into your Snowflake database to play with. I would recommend following this blog series to support that: Alteryx & Snowflake = Mind blown Speeds. For ultimate learning, make sure you have a point dataset and a polygon dataset that cover the same area, so you get the most out of what is coming next on this blog. If you just want a small sample to experiment with, the hand-loaded sketch below will do.
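This is only a toy example, assuming the hypothetical stores and counties tables used for illustration throughout the rest of this post; any real dataset loaded via the blog series above works just as well:

```sql
-- Hypothetical point dataset: store locations
CREATE OR REPLACE TABLE stores (
    store_id INT,
    geog     GEOGRAPHY
);

INSERT INTO stores
SELECT 1, TO_GEOGRAPHY('POINT(-1.8904 52.4862)')   -- Birmingham
UNION ALL
SELECT 2, TO_GEOGRAPHY('POINT(-0.1276 51.5072)');  -- London

-- Hypothetical polygon dataset: boundary areas
CREATE OR REPLACE TABLE counties (
    county_name VARCHAR,
    geog        GEOGRAPHY
);

INSERT INTO counties
SELECT 'Demo County',
       TO_GEOGRAPHY('POLYGON((-2 52, -1 52, -1 53, -2 53, -2 52))');
```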
Only continue if you have spatial data in Snowflake 😉
Starting at our home page:
Let's navigate to the Data Tab:
Click IMPORT data at the top:
Once you are here, if you have not yet set up your connection to Snowflake, do this now. If you need support, please visit this link: Snowflake Connections on Alteryx Cloud Platform.
Within my Demo_Snowflake database, I have a schema dedicated to spatial data, so I am going to navigate to this.
Once here, I can see my datasets:
Click the plus on the left-hand side of each dataset to add it to your available data within the platform. You can see the progress of each load on the right-hand side.
Once the load has completed, you should see a preview of the data; in my case, as I only have a couple of columns, the GeoJSON is easy to spot:
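If you are curious what that GeoJSON looks like under the hood, you can reproduce it in Snowflake with ST_ASGEOJSON (using the hypothetical stores table from earlier):

```sql
-- Convert a GEOGRAPHY value to its GeoJSON representation
SELECT ST_ASGEOJSON(geog) AS geojson
FROM stores
LIMIT 1;
-- e.g. { "coordinates": [-1.8904, 52.4862], "type": "Point" }
```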
Click continue:
Now we can switch into our Location Intelligence application.
You can create a new project or open an existing one:
Once you can see your map, you can click Add Layers:
Within the layers panel, you can now see your new datasets and where they originate from:
Add your point layer and polygon layer to the map. Hopefully they both cover the same area 😉 as you can probably guess what is coming next.
When they are first added to the map, they will be given a default style, which you can change later.
But you can also see that the layer list on the left shows each layer’s spatial data type as well as the spatial column that is mapped. This matters because, if there are multiple spatial columns within a dataset, you can choose which one is mapped when loading it in.
Now let’s head to the analysis tab:
Once here, we can Summarize by Area:
Select your polygon dataset as the boundary layer and your point dataset as the data layer, and you can count the number of points within each polygon very easily. Click Apply Analysis and Run.
A new layer is created within the platform with the results of your analysis, and it is automatically added to the map.
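Under the hood, Summarize by Area is conceptually a points-in-polygons count. If you wanted the same answer directly in Snowflake, a sketch (again using the hypothetical table names from earlier) might look like this:

```sql
-- Count the points (stores) that fall inside each polygon (county)
SELECT c.county_name,
       COUNT(s.store_id) AS store_count
FROM counties c
LEFT JOIN stores s
       ON ST_CONTAINS(c.geog, s.geog)  -- TRUE when the point lies inside the polygon
GROUP BY c.county_name
ORDER BY store_count DESC;
```

The LEFT JOIN keeps polygons with no points, so empty counties show a count of 0 rather than disappearing from the results.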
Again, a default style has been selected, so let’s update that now. Click the three-dots menu of the layer on the left-hand side of your map and choose Edit Style.
To change the style, select the column you want to style by: here, the new COUNT column within the dataset.
You can see that it has already been styled with a sequential colour range of 5 steps. From the drop-down menus here, you can adjust the number of steps, the colours, and the style of the range (for example, diverging). Once you are happy, return to the main layers menu.
Here we are going to turn off the original layers to give us a clean map to look at:
Voilà! You have just completed a spatial analysis to understand the number of stores within each county across the UK, using data that resides within a Snowflake database.