IT Services Blog

Open location mapping for finding stairs and elevators in public spaces

by Anson Parker on 2020-07-24T22:27:00-04:00

Working in the library, we have many opportunities to use our technical resources to research ways of improving conditions for all of our patrons.  As part of our technical support for GIS and data analytics tools, and our ongoing commitment to proactively addressing ADA concerns and values, this staff-led research highlights the value of open source, open standards, and reproducibility, in the hope that it will be useful to other colleagues interested in the effort.

Tools gettin' used

  • Jupyter Python notebooks - a look at why these public notebooks are so helpful in data science
  • Python and GeoPandas - data science tools extended to geospatial data
  • Movingpandas.org - motion tracking and trajectory analysis algorithms
  • QGIS - quick GIS visualizations and queries to check hypotheses before programming
  • Kaggle - hosting data and working with their notebooks

Wayfinding for blind and wheelchair enabled people predictably presents a wide array of challenges.  As anyone who uses tools like Google or Apple Maps knows, they are of limited use inside buildings.  Challenges arise from a lack of accurate floor plans on one hand, and from GPS signals requiring line-of-sight for best results on the other.

Among the greatest difficulties encountered while navigating buildings is getting from one floor to another.  If you are wheelchair enabled you will absolutely prefer an elevator, whereas if you are blind, elevators often represent one of the worst possible modes of transportation.  "I thought I was going up and ended up in a laundry room," said a colleague discussing the issue.

Using cell phone GPS and altitude data we can resolve latitude and longitude to within ~5 meters; on the z axis we can resolve to about 1 meter or better.  That means we can parse floors from our data effectively, and subsequently, with additional computation, identify approximate locations of stairs and elevators on maps (there is still a lot of room for improvement here).
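To make the floor-parsing idea concrete, here is a minimal sketch of binning altitude readings into floor indices. The base altitude and 3.5 m floor height are hypothetical placeholder values, not calibrated figures from our data:

```python
# Sketch: assign a floor index to each GPS fix from its altitude.
# base_altitude_m (ground-floor elevation) and floor_height_m are
# made-up example values; a real deployment would calibrate both.

def altitude_to_floor(altitude_m, base_altitude_m=140.0, floor_height_m=3.5):
    """Round the height above the ground floor to the nearest floor index."""
    return round((altitude_m - base_altitude_m) / floor_height_m)

fixes = [140.2, 139.8, 143.6, 144.1, 147.3]  # made-up altitudes, metres
floors = [altitude_to_floor(a) for a in fixes]
print(floors)  # [0, 0, 1, 1, 2]
```

With ~1 m vertical accuracy and ~3.5 m between floors, simple rounding like this separates floors cleanly; the hard part is calibrating the base altitude.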

5 meters is sufficiently useful as-is for wheelchair enabled people; for blind and visually impaired people, however, 5 meters may as well be a mile.  The proposal is that within 5 meters a user could start an AI camera that would begin looking for stairs.  Cameras might also be able to collect data on elevator button pads and add that content into a global mapping system like OpenStreetMap.

Alternatively, a group like http://aira.io/ could step in when the app ceases to be of significant utility.

Method 
We work in a library, and have therefore made a strong effort to do things in a standardized, transparent, and reproducible way.  Our data gathering tool, Traccar.org, allows us to provide real-time maps for end users.


More importantly, Traccar allows for data exports for analysis, plus APIs for research scientists.
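As a sketch of what pulling positions over that API might look like: the `/api/positions` endpoint and its `deviceId`/`from`/`to` parameters follow Traccar's REST API documentation, while the server URL and credentials below are placeholders.

```python
# Sketch: fetch recent positions from a Traccar server's REST API.
# Endpoint and parameters per the Traccar API docs; server URL,
# username, and password are placeholder values.
import base64
import json
import urllib.request
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

def position_query(device_id, hours=24):
    """Build /api/positions query parameters covering the last `hours`."""
    now = datetime.now(timezone.utc)
    return {
        "deviceId": device_id,
        "from": (now - timedelta(hours=hours)).isoformat(),
        "to": now.isoformat(),
    }

def fetch_positions(base_url, user, password, device_id):
    url = f"{base_url}/api/positions?" + urlencode(position_query(device_id))
    req = urllib.request.Request(url)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")  # Traccar uses basic auth
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # list of dicts with latitude/longitude/altitude

# Usage (placeholder server and credentials):
# positions = fetch_positions("http://localhost:8082", "admin", "admin", 1)
```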

 

(A first draft of a trajectory map, made with http://movingpandas.org/)


We provide our content in GeoJSON, and we're working with QGIS, http://movingpandas.org/, and https://scikit-mobility.github.io/scikit-mobility/ to analyze the data.

(scatter3d plot with stairs highlighted)

Challenges and next steps

  • Consider making a streamlit.io app to make the algorithms easier to understand
  • Figure out best practices for uploading feature data into OpenStreetMap - the data should be stored in a publicly available and useful place for users and developers, and OSM satisfies those criteria
  • The altitude data is internally consistent, but may be improved by calibration - perhaps working with SRTM data could help
  • Still need estimates of area completeness (i.e. when it's safe to say that an elevator or stairwell has been found from GPS tracking)
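On that last point, one possible heuristic is to declare a candidate stair or elevator location "found" once enough floor-transition fixes fall within the GPS error radius. This is only a sketch; the 5 m radius matches the accuracy figure above, but the hit threshold is a made-up value:

```python
# Sketch of a completeness heuristic: a candidate location counts as
# "found" when at least min_hits floor-transition fixes land within
# radius_m of it. min_hits is a hypothetical, uncalibrated threshold.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def is_found(candidate, transition_fixes, radius_m=5.0, min_hits=10):
    """candidate: (lat, lon); transition_fixes: (lat, lon) points where a floor change was observed."""
    hits = sum(
        1 for lat, lon in transition_fixes
        if haversine_m(candidate[0], candidate[1], lat, lon) <= radius_m
    )
    return hits >= min_hits
```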

Would you like to analyze the data? Our initial data set is on Kaggle here; there are some Jupyter notebooks to convert the Traccar exports to GeoJSON, and we're now working on using the Traccar API to do that work. If you give it a try, please let us know. We'd be interested in your results!
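The core of that export-to-GeoJSON conversion can be sketched in a few lines. The field names here (latitude, longitude, altitude, devicetime) match the tc_positions columns used in the code below; treat them as assumptions if your export differs:

```python
# Sketch: convert one Traccar position record into a GeoJSON feature.
# Field names are assumed to match the tc_positions export.
import json

def position_to_feature(row):
    """row: dict with latitude, longitude, altitude, devicetime keys."""
    return {
        "type": "Feature",
        "geometry": {
            "type": "Point",
            # GeoJSON coordinate order is [longitude, latitude, altitude]
            "coordinates": [row["longitude"], row["latitude"], row["altitude"]],
        },
        "properties": {"time": row["devicetime"]},
    }

rows = [{"latitude": 38.0316, "longitude": -78.5009, "altitude": 144.0,
         "devicetime": "2020-07-24T12:00:00Z"}]
collection = {"type": "FeatureCollection",
              "features": [position_to_feature(r) for r in rows]}
print(json.dumps(collection, indent=2))
```

Note the longitude-before-latitude ordering: GeoJSON mandates it, and it is the most common bug in these conversions.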

Here's a recent Python 3 code dump, explained:

import sqlalchemy as db  # grab the Traccar cell phone GPS data straight from the Postgres database; great performance, great for large data sets
import pandas as pd  # does the data-sciencey stuff
import geopandas as gpd  # does the data science stuff with geo-located data
import osmnx as ox  # this library grabs our OpenStreetMap data - in this case the library building
import plotly.express as px  # draws our scatter plot at the end

engine = db.create_engine('postgresql+psycopg2://traccar:traccar@localhost:5432/postgis_traccar_db')  # connect to the local database
connection = engine.connect()
metadata = db.MetaData()
positions = db.Table('tc_positions', metadata, autoload=True, autoload_with=engine)  # reflect the positions table
query = db.select([positions])  # build the query

ResultProxy = connection.execute(query)
ResultSet = ResultProxy.fetchall()
df = pd.DataFrame(ResultSet)

points = gpd.GeoDataFrame(
    df, geometry=gpd.points_from_xy(df[8], df[7], df[9]))  # columns 8, 7, 9 are longitude, latitude, altitude; lazy here, should probably rename columns at some point
points.crs = "EPSG:4326"  # set a CRS

place_name = "Claude Moore Health Sciences Library"
place = ox.geocoder.geocode_to_gdf(place_name)  # building footprint from OpenStreetMap
place.crs = "EPSG:4326"

within_points = gpd.sjoin(points, place, predicate='within')  # keep only fixes inside the building
fig = px.scatter_3d(within_points, x=8, y=7, z=9)
fig.show()

 



Claude Moore Health Sciences Library
1350 Jefferson Park Avenue P.O. Box 800722
Charlottesville, VA 22908 (Directions)

© 2024 by the Rector and Visitors of the University of Virginia
Copyright & Privacy