I’ll be offering the one-week professional training course on Open Source GIS again through Hunter College Continuing Education, running from January 25–29, 2016. We have changed the content of the course a bit so that each day is a self-contained unit: participants can book individual days if they are only interested in specific topics, or book the full week to get the full FOSS GIS treatment. This is the schedule:
- Monday: QGIS. Hands-on introduction to the most popular FOSS desktop GIS, covering setup, data management, spatial analysis, map design, digitizing, the plugin ecosystem, and integration with other FOSS GIS software.
- Tuesday: PostGIS. Introduction to the spatial extension to the popular PostgreSQL database. Participants will learn how to set up and use PostGIS as the central data storage in their GIS software stack, how to connect it to other software, and how to query the database using SQL.
- Wednesday: Geospatial Python. General introduction to the Python programming language with a specific focus on its capabilities for processing and analyzing geographic information, including automating GIS workflows, raster and vector data processing, and geocoding.
- Friday: Web Services. Introduction to geospatial web services, including OGC services such as the Web Map Service and the Web Feature Service, as well as tile services that deliver base maps for interactive web maps. Participants will learn how these different kinds of web services work and practice setting up and configuring them.
Head over to the Hunter Continuing Education website for information on prerequisites, pricing, and signup.
I wrote a little wrap-up of this year’s Google Summer of Code Mentor Summit over on the 52° North Blog.
The NYC Department of Health and Mental Hygiene has invited me to give a talk and be a discussant at their GIS day panel. The event will go down next Monday.
6th International Workshop on Location and the Web
April 11 or 12, 2016
In conjunction with WWW 2016
25th International World Wide Web Conference
April 11–15, 2016, Montreal, Canada
Paper submission deadline: Dec 22, 2015
Paper acceptance notifications: Feb 02, 2016
Camera ready hard deadline: Feb 08, 2016
Workshop: April 11 or 12, 2016
Call for Papers
Location has quickly moved into the mainstream of the (mobile) Web. It also continues to be a strong driver of applications and research activities. After the initial boost and consolidation of approaches based on the simple use of geospatial coordinates, we now see an increasing demand for more sophisticated location-based services, involving more powerful mechanisms in terms of information retrieval, mining, analytics and semantics. New application areas for Web architecture, such as the Internet of Things (IoT) and the Web of Things (WoT), also mean that there will be increasingly rich and large sets of resources for which location is highly relevant.
Following the successful LocWeb workshops in 2008, 2009, 2010, 2014, and 2015, LocWeb 2016 continues the workshop series, addressing issues at the intersection of location-based services and Web architecture. Its focus lies in Web-scale systems and services facilitating location-aware information access. The location topic is understood as a cross-cutting issue equally concerning Web information retrieval, semantics and standards, and Web-scale systems and services.
LocWeb is an integrated venue where the location aspect is discussed in depth within an interdisciplinary community. It is also highly interactive and collaborative, with ample room for discussion and demos that will explore and advance the geospatial topic in its various relevant areas. We expect the workshop to further the integration of the geospatial dimension into the Web, and promote challenging research questions.
LocWeb 2016 solicits submissions under the main theme of Web-scale Location-Aware Information Access. Subtopics include (i) geospatial semantics, systems, and standards; (ii) large-scale geospatial and geo-social ecosystems; (iii) mobility; (iv) location in the Internet/Web of Things; and (v) mining and searching geospatial data on the Web. The workshop encourages submissions describing Web-mediated or Web-scale approaches that build on reliable foundations, and that thoroughly understand and embrace the geospatial dimension.
Topics of Interest
– Location-Aware Information Access
– Location-Aware Web-Scale Systems and Services
– Location in the Web of Things
– Large-scale Geospatial Ecosystems
– Standards for Location and Mobility Data
– Location in Unstructured and Semi-Structured Information Sources
– Location Semantics
– Modeling Location and Location Interaction
– Geo-Social Media and Systems
– Location-Based Social Networks
– Geospatial Web Search and Mining
– Visual Analytics of Geospatial Data on the Web
– Location-Based Recommendation
– Mobile Search and Recommendation
We solicit full papers of up to 8 pages, and short papers of up to 4 pages describing work-in-progress or early results. Authors are invited to submit original, unpublished research that is not being considered for publication in any other forum.
Workshop submissions will be evaluated based on the quality of the work, originality, fit with the workshop themes, technical merit, and their potential to inspire interesting discussions. The review process is single-blind, so please include your name and affiliation on your submission.
Manuscripts should be formatted using the ACM camera-ready templates (http://www2016.ca/calls-for-papers/call-for-research-papers.html) and submitted in PDF format to EasyChair at https://www.easychair.org/conferences/?conf=locweb2016
Accepted workshop papers will be published in the WWW companion proceedings and will be available from the ACM Digital Library. Note that other conferences or journals may regard them as prior publications.
For inclusion in the proceedings, at least one author of the accepted paper has to register for the workshop.
Presenters are encouraged to bring demos to the workshop to facilitate discussion.
Organizers
Dirk Ahlers, NTNU, Norway
Erik Wilde, Siemens, USA
Bruno Martins, University of Lisbon, Portugal
Technical Program Committee (tentative)
Andreas Henrich, Universität Bamberg, Germany
Arjen de Vries, CWI, Netherlands
Bruno Martins, University of Lisbon, Portugal
Carsten Keßler, CUNY, USA
Chandan Kumar, University of Koblenz-Landau, Germany
Christoph Trattner, Graz University of Technology, Austria
Christopher Jones, Cardiff University, UK
Claudia Hauff, Delft University, Netherlands
Clodoveu Davis, Universidade Federal de Minas Gerais, Brazil
Dirk Ahlers, NTNU, Norway
Erik Wilde, Siemens, USA
Francisco López-Pellicer, Universidad Zaragoza, Spain
Massimiliano Ruocco, NTNU, Norway
Max Egenhofer, University of Maine, USA
Rainer Simon, AIT Austrian Institute of Technology, Austria
Ross Purves, Universität Zürich, Switzerland
Steven Schockaert, Cardiff University, UK
Torsten Suel, New York University, USA
Vanessa Murdock, Bing, USA
Yana Volkovich, Centro Tecnológico de Catalunya, Spain
This is the funniest thing you will read this weekend:
The plan: get a robot arm, have it pour cereal and milk into a bowl and feed it to me with a spoon. The only catch was that I had to learn how to program a robot arm and code a pretty complicated sequence to have it serve me breakfast. But I was up for the challenge, because the best way to avoid real problems is to deal with fake ones.
Carsten Keßler and Carson J. Q. Farmer (2015) Querying and integrating spatial–temporal information on the Web of Data via time geography. Journal of Web Semantics, in press. DOI: 10.1016/j.websem.2015.09.005
Abstract: The Web of Data is a rapidly growing collection of datasets from a wide range of domains, many of which have spatial–temporal aspects. Hägerstrand’s time geography has proven useful for thinking about and understanding the movements and spatial–temporal constraints of humans. In this paper, we explore time geography as a means of querying and integrating multiple spatial–temporal data sources. We formalize the concept of the space–time prism as an ontology design pattern to use as a framework for understanding and representing constraints and interactions between entities in space and time. We build on a formalization of space–time prisms and apply it in the context of the Web of Data, making it usable across multiple domains and topics. We demonstrate the utility of this approach through two use cases from the domains of environmental monitoring and cultural heritage, showing how space–time prisms enable spatial–temporal and semantic reasoning directly on distributed data sources.
The US Census provides an incredible wealth of data but it’s not always easy to work with it. In the past, working with the tabular and spatial census data generally meant downloading a table from FactFinder and a shapefile from the boundary files site and joining the two, perhaps in a GIS system. These files could also be handled in R but getting the data, reading it into R and, in particular, merging tabular and spatial data can be a chore. Working with slots and all the different classes of data and functions can be challenging.
A recent interesting post on stackoverflow by Claire Salloum prompted me to revisit this issue in R and I’ve definitely found some valuable new packages for capturing and manipulating Census data in R.
Great post explaining how to wrangle and map census data in R.
ESWC is one of the key academic conferences to present research results and new developments in the area of the Semantic Web. For its 13th edition, ESWC will be back in Hersonissou, Crete, between Sunday May 29th and Thursday June 2nd 2016.
This time, ESWC will feature a special track on smart cities, urban and geospatial data:
More than half of the world’s population is already living in urban areas today. UN projections show that this proportion will grow to 66% by 2050, adding another 2.5 billion people to our cities. Geospatial data provided by sensor networks, different remote sensing technologies, citizen scientists, social networks, as well as Open Data initiatives helps cities address these challenges and transform into smart cities.
However, amid such a diversity of information, large amounts of valuable open data and sensor information remain unused, and the aggregation of information from various sources is typically limited to specific application domains, with organizations and cities often reaping the benefits only after extensive investments. With most of the world’s information today still handled in silos, there is enormous potential for better information management, search, discovery, and reuse of heterogeneous urban data using Semantic Technologies, in order to make cities more intelligent, innovative, and integrated beyond the boundaries of isolated applications.
In this track, we invite submissions that address the use of Semantic Web technologies in the context of this transformation process. Submissions to this track should contain original, unpublished research that shows how urban and smart city applications can benefit from Semantic Web technologies. Authors are strongly encouraged to include concrete application examples, ideally using real data, in their papers. Papers in this track will be evaluated based on the impact of semantic technologies on society and the extent to which they address real-life problems in the context of cities. Papers are also expected to evaluate, or provide deeper insight into, the significant advantages of a semantic solution over the state-of-the-art non-semantic solutions commonly used by practitioners.
- Semantic integration and processing of remotely sensed data and data from in-situ sensors
- Semantic models for spatial-temporal change
- The city as an API
- Semantics of urban sensor networks
- Semantic integration of distributed urban data
- Semantic analysis of data streams
- Semantic Web applications addressing urban topics such as transport, energy, building, safety, water, food, waste, or emissions
- Semantics for citizen-centric Smart cities
- Application of semantic technologies, sensors and semantic streams for e-Health, Life Sciences, e-Government, Environmental Monitoring, Cultural Heritage, Utility Services or Social Sensing
- Intelligent User Interfaces and Interaction Paradigms that profit from semantics and knowledge graphs over Web Data, open government and corporate data relating to cities
- Context- and location-aware (mobile) applications based on semantic technologies and geo-semantics
- Provenance, access control, trust and privacy-preserving issues in smart cities
- Semantic-based cloud applications for Smart Cities
- Semantic reasoning, event detection, knowledge extraction and analytics for smart city platforms
- Big data and scaling out in semantic cities; managing real-time and historical city data using knowledge representation models
- Semantic platforms, knowledge acquisition, publishing, consumption, evolution and maintenance of city data
All deadlines are at 23:59 Hawaii Time.
Compulsory abstract submission for all papers: Friday 11th December 2015
Compulsory full paper submission: Friday 18th December 2015
Author rebuttal: Friday 29th Jan – Friday 5th Feb 2016
Acceptance notification: Monday 22nd February 2016
Camera ready: Monday 7th of March 2016
Strabon is a triple store that has been developed with a focus on spatio-temporal query functions. I had been meaning to play around with Strabon for a while, and while at ISWC earlier this week, I decided to finally give it a shot. The user guide has no instructions for setting up Strabon on Mac OS, so here’s what I did to get it running:
If you don’t have Homebrew installed yet, open a terminal window and paste the following command to install it:
ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
Next, use Homebrew to install Maven, Mercurial, PostgreSQL, and PostGIS:
brew install maven mercurial postgres postgis
Don’t worry if you have any of those installed already; Homebrew is smart enough to figure that out and will just skip over them.
Next, we’ll initialize a new PostgreSQL data directory and start the database server:
initdb strabon
postgres -D strabon
The terminal window will now be “occupied” by the PostgreSQL server (press Control+C to shut it down when you are done), so open a new terminal window (⌘N) to continue there. Next, we’ll create the actual database and enable the PostGIS extension on it:
createdb -E UTF8 -T template0 template_postgis
psql -d template_postgis -c "CREATE EXTENSION postgis;"
Next, we’ll adjust the permissions on that database so that Strabon can make changes later, do some housekeeping, and create another database for the endpoint:
psql -d template_postgis -c "GRANT ALL ON geometry_columns TO PUBLIC;"
psql -d template_postgis -c "GRANT ALL ON geography_columns TO PUBLIC;"
psql -d template_postgis -c "GRANT ALL ON spatial_ref_sys TO PUBLIC;"
psql -d template_postgis -c "VACUUM FULL;"
psql -d template_postgis -c "VACUUM FREEZE;"
psql -d postgres -c "UPDATE pg_database SET datistemplate='true' WHERE datname='template_postgis';"
psql -d postgres -c "UPDATE pg_database SET datallowconn='false' WHERE datname='template_postgis';"
createdb endpoint -T template_postgis
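As an optional sanity check (not part of the original setup), you can ask the new endpoint database for its PostGIS version; if the extension did not make it over from the template database, this query fails:

```shell
# Should print a PostGIS version string if the extension
# was copied over from the template database correctly
psql -d endpoint -c "SELECT PostGIS_version();"
```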
That’s all for a PostgreSQL database with spatial extensions. Next, we’ll install the temporal extension (you can skip this part if you are only interested in the spatial functions). We’ll download, build, and install it like so:
git clone https://github.com/jeff-davis/PostgreSQL-Temporal.git
cd PostgreSQL-Temporal
make
make install
make installcheck
psql -d endpoint -c "CREATE EXTENSION temporal;"
Next, we’ll install Tomcat, a servlet container that we’ll use to host Strabon, and modify the Tomcat users. Note that if you already have Tomcat installed with other servlets running in it, you should just add a user to the Tomcat configuration and skip this step:
cd ~
curl http://mirror.symnds.com/software/Apache/tomcat/tomcat-8/v8.0.28/bin/apache-tomcat-8.0.28.zip > apache-tomcat-8.0.28.zip
unzip apache-tomcat-8.0.28.zip
rm apache-tomcat-8.0.28.zip
cd apache-tomcat-8.0.28/conf
mv tomcat-users.xml tomcat-users-backup.xml
curl https://gist.githubusercontent.com/crstn/2daac7483501c8b8beda/raw/26d66cbfaec03cb8b36bedd895c6c4e4955ffe7e/tomcat-users.xml > tomcat-users.xml
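If you are adding a user to an existing Tomcat installation instead, this is roughly what tomcat-users.xml looks like (the user name, password, and roles below are placeholders for illustration; see the gist linked above for the values used in this tutorial):

```xml
<tomcat-users>
  <!-- illustrative entry only: choose your own credentials -->
  <role rolename="manager-gui"/>
  <role rolename="manager-script"/>
  <user username="tomcat" password="s3cret" roles="manager-gui,manager-script"/>
</tomcat-users>
```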
And start Tomcat:
cd ../bin
sh catalina.sh start
Now that we’re done setting up the PostGIS backend and Tomcat, let’s move on to downloading and compiling Strabon:
cd ~
hg clone http://hg.strabon.di.uoa.gr/Strabon/
cd Strabon
hg update temporals
mvn clean package
Don’t worry if you get an error message for Strabon: Executable endpoint here. That part tries to automatically deploy the compiled code to Tomcat; we’ll do that manually instead:
cp endpoint/target/strabon-endpoint-*.war ~/apache-tomcat-8.0.28/webapps/strabonendpoint.war
That’s all. At this point, you should be able to visit http://localhost:8080/strabonendpoint/ in your browser. Before you can load data into Strabon and write stSPARQL queries, there is one last bit of configuration to do. On the web interface, go to Explore/Modify Operations > Configuration and enter the following information (assuming you used the information exactly as provided above):
- Database Name: endpoint
- Username: test
- Password: test
- Port: 5432
- Hostname: localhost
- Database Backend: postgis
Since we didn’t set up the database with a password, you can put in any username and password you want, but the fields can’t be empty (this seems to be a JDBC issue). If Strabon can connect to PostGIS with the configuration parameters provided, it will take you to the query page after you click Connect. If there is something wrong with your parameters, nothing happens – you will just stay on the configuration page without any error message (it took me a while to figure this out…). If that happens, take a look at the terminal window running the PostgreSQL server and at the Tomcat logs in
~/apache-tomcat-8.0.28/logs/, which should give you an idea of what’s going wrong.
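To peek at those logs from the terminal, tail does the job (catalina.out is Tomcat’s default log file; the path assumes the install location used above):

```shell
# Show the 50 most recent lines of the main Tomcat log
tail -n 50 ~/apache-tomcat-8.0.28/logs/catalina.out
```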