Geography 2050 – Future of Mobility

Every November the American Geographical Society holds its flagship fall event, the Geography 2050 symposium. Designed as a multi-year strategic dialogue on the trends that will shape the planet in 2050, the theme for 2017 was Geography 2050: The Future of Mobility.

In the Geography 2050 sequence, we envisioned the world of 2050, then focused on the interrelated topics of urban growth, sustainability, and mobility; next year's theme will be energy. For each event we assemble a highly curated set of sessions and speakers that combine a geographic perspective with the academic, government, and business communities in a way no other event replicates.

Background

In order to understand the roots of Geography 2050, and what we are trying to accomplish as the American Geographical Society (AGS), some backstory is needed. Four years ago, the AGS began a dramatic revitalization effort, one that would bring a significant influx of new energy, people, and ideas into the Society. Spearheaded by now-Chairman Dr. Chris Tucker, the idea of AGS 2.0 has taken root, with Geography 2050 as one of its central concepts. This video, from the initial Geography 2050, describes the 2050 concept in more detail:

This synthesis is the hallmark of the AGS, and a differentiating factor in the discussions we facilitate. AGS history represents some of the best applications of the geographic perspective over the last 165 years, whether in exploration, commerce, diplomacy, defense, or education. As we envision the role of AGS in the 21st century, we see a United States in need of greater geographic awareness. We see one role of AGS as a means to help foster a national dialogue on spatial literacy, and to be a force for change in geographic education. These values are summed up well by AGS President Emeritus (and my Ph.D. advisor), Dr. Jerome Dobson, in this short video from the original Geography 2050:

Highlights

The Future of Mobility event continued this trend, and the quality of sessions may have even exceeded previous years (kudos to Dean Wise, AGS Councilor and conference chair). It was hard not to leave the conference amazed at the pace of change in the mobility sector, and at how close this next wave of technologies is to fundamentally changing how we are spatially organized. Videos of the sessions are being finalized now, and I'll update when they are complete (see the Program here). Below are a few of my takeaways from the conference (and since I'm not an expert in mobility and transportation, I learned a tremendous amount from the sessions).

Clearly electric vehicles are already here, but when combined with longer battery life, increasing levels of automation, and “transport as a service” business models, the economics of automobile ownership and utilization change dramatically. The impact of this change on individual behavior and public planning for infrastructure leads to some widely divergent models that swing between transport utopia and massively underfunded public transportation. And this does not account for the potential of automated trucking and freight transport. We are already on the cusp of these changes, and even some of the best minds in transportation planning don't really know what is going to happen in the next 10 years…let alone the 33 years until 2050.
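The ownership-economics point can be made concrete with a back-of-the-envelope sketch: spreading a vehicle's fixed costs over far more miles is what changes the equation. All figures below are assumed round numbers for illustration, not sourced data.

```python
# Illustrative comparison of per-mile costs for a privately owned car
# versus a heavily utilized "transport as a service" fleet vehicle.
# All dollar figures and mileages are assumptions, not real data.

def cost_per_mile(annual_fixed, per_mile_variable, annual_miles):
    """Spread annual fixed costs plus variable costs over miles driven."""
    return (annual_fixed + per_mile_variable * annual_miles) / annual_miles

# Assumed: a personal car sits parked most of the time, so fixed costs
# (depreciation, insurance, parking) are spread over relatively few miles.
owned = cost_per_mile(annual_fixed=7000, per_mile_variable=0.15, annual_miles=12000)

# Assumed: an autonomous fleet vehicle carries similar fixed costs but
# runs most of the day, spreading them over many more miles.
fleet = cost_per_mile(annual_fixed=7000, per_mile_variable=0.15, annual_miles=80000)

print(f"owned: ${owned:.2f}/mile, fleet: ${fleet:.2f}/mile")
```

Under these assumed numbers the fleet vehicle's cost per mile falls to roughly a third of the owned car's, which is the kind of gap that reshapes behavior and planning.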

Vertical take-off and landing (VTOL) aircraft, flying cars, and other personal “jet pack” devices are becoming a reality. At this point, each of these is still piloted by humans, but they clearly have the potential to become more and more autonomous (or remotely piloted). The regulatory environment, already stressed by simply figuring out how to handle “beyond line of sight” human-flown drones, is going to be incredibly burdened as autonomous package delivery, flying taxi service, and personal VTOL devices become more common. Domestic and international airspace management regulations are incredibly complex, and accommodating these new forms of transport will be a generational challenge.

Hyperloop…wow…a paradigm shift in the making. The potential of this technology is so absolutely incredible that it is truly difficult to appreciate the implications for human society. If we have any political will left in this country to accomplish great things, we should move as fast as possible towards Hyperloop. America's bet on the automobile has been an incredible benefit to our society, and yet it has also locked us into a transport paradigm that is strangling our cities. From everything I saw, Hyperloop is the only technology that could fundamentally change the equation. The sheer speed at which it travels would make disparate cities the same travel time apart as current metro stops. To view some of the proposed Hyperloop routes and travel times between stops, check out the winners of the Global Route Challenge…with interactive maps.
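To get a feel for that time compression, here is a toy calculation with assumed round-number speeds (not engineering specifications) for a hypothetical 500 km city pair:

```python
# Toy travel-time comparison for a hypothetical 500 km intercity trip.
# Speeds are illustrative assumptions: ~100 km/h highway driving versus
# ~1000 km/h, a commonly cited Hyperloop-class cruise speed.

def travel_minutes(distance_km, speed_kmh):
    """Travel time in minutes at a constant speed (ignores stops/accel)."""
    return distance_km / speed_kmh * 60

drive = travel_minutes(500, 100)       # highway driving
hyperloop = travel_minutes(500, 1000)  # Hyperloop-class speed

print(f"drive: {drive:.0f} min, hyperloop: {hyperloop:.0f} min")
```

At those assumed speeds a five-hour drive collapses to half an hour, which is roughly the time cost of crossing a large city on a metro.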

Honoring Digital Cartography

On a personal level, this year was the first time that I’ve participated in Geography 2050 as an AGS Councilor (I spoke at the first event in 2014 before I joined the AGS Council). Having known and studied under Dr. Dobson since his arrival at the University of Kansas in 2001, I have been long steeped in AGS and its proud traditions. When Chris first proposed AGS 2.0, and started to invite new Councilors, I knew I wanted to be part of bringing new energy into the Society.

What I bring to AGS is an expertise in geographic information science and technology. Building on that background, my goal as a Councilor is to ensure AGS continues to be at the fore of geographic technology, and that the Society begins to formally recognize the contributions of those who are responsible for powering the “geospatial revolution”. Over the last 20 years geographic technologies have revolutionized our society (Global Positioning Systems, high-resolution satellite imagery, in-car navigation, interactive web maps, etc.), and it is important that the Society formally acknowledge those in government, industry, and academia who contributed.

To that end, the 2017 edition of Geography 2050 marked the first attempt to do this. I believe there are many people who deserve to be honored for their contributions to the post-millennium explosion in digital geography, so nominating the right people had to be a mix of contributions and applicability to this year’s 2050 theme of mobility. Given that combination, it was clear that Brian McClendon and John Hanke were the right choices to nominate for the AGS O.M. Miller Cartographic Medal. I made the nomination, and the AGS Vice President and Chair of the Awards committee, Deborah Popper, wrote up two wonderful award letters that were read at the ceremony.

A Conversation with John Hanke and Brian McClendon

Geography 2050 panel: Dr Campbell (left), Mr McClendon (center), Mr Hanke (right). From https://twitter.com/tomfitzwater/status/931604897371475968

In addition to the award, both Brian and John were kind enough to participate in a Geography 2050 session, which was formatted as a conversation with them. It was an honor to moderate this session, a job that required I ask a couple opening questions and then get out of their way. I don’t get anxious speaking in front of crowds often, but have to admit, this one was a bit nerve-racking. But ultimately I think the session went well, and there was a lot of great feedback.

Update: Video of the session has been posted here and embedded below:

The goal of my questions was to first look back on their Keyhole / Google Earth experience, using it as a historical lens to view current trends, and then to discuss the trends in autonomy and augmented reality that are affecting mobility. Co.Design (a Fast Company publication) wrote an article on the session that has more detail.

I’d like to thank both of them personally for taking the time out of their schedules to come to New York to accept the AGS Miller Medal and participate in the panel. I certainly learned a lot, and feel that as AGS, we are off to a good start in honoring the pioneers of the modern digital geography movement.

If interested in additional information, Trajectory Magazine recently published an article on the history of Google Earth that provides great background on the evolution of the technology, and its role in the national security context.


OpenStreetMap in Africa (2013–2015), beautifully visualized

The Ito World crew is back at it with a new OpenStreetMap visualization, this time for Africa. Results are shown at the continental scale and for selected cities over the last couple of years. The final product is stunning, as usual.

Growth in West Africa as part of the Ebola response, and the Nigeria eHealth Import are the most distinctive. Other growth areas include a broad swath of East Africa, and the incredible density of the Map Lesotho project. Most impressive, however, is that the growth is not constrained to these areas; it is distributed across the continent. The missing areas are gradually filling in, it is only a matter of time…


A decade of OpenStreetMap, beautifully visualized

Adding to the collection of amazing OpenStreetMap animations, the folks at Scout worked with the Ito World team to create a new addition to the “Year in Edits” series. Celebrating the 10-year anniversary of OpenStreetMap, the new video looks at the growth in OSM from 2004 to 2014. As I've blogged before, I love these videos. The production quality is high, the music is great, and they provide an easy way to communicate how amazing the OSM database has become. Kudos to all the volunteer mappers out there.

 


Why Maps Matter

Back in March 2014, FCW published two articles written by Frank Konkel that mentioned the HIU's work with digital mapping. The first, entitled “Why Maps Matter,” is a good summary piece that reviews how geographic technology is used across various U.S. government agencies. The key points are the growing recognition in government that visualization is a powerful tool for policy making, and how new companies are making it significantly easier for new users to leverage geographic technology. Beyond the HIU, case studies from the Federal Communications Commission (FCC), Capitol Hill, the National Park Service (NPS), and the National Geospatial-Intelligence Agency (NGA) are mentioned.

The second article, State Department: Mapping the humanitarian crisis in Syria, is a shorter piece that focuses solely on the HIU and its work in mapping the Syria humanitarian crisis. Having worked closely on Syria for two years, I can say we put a tremendous amount of effort into building comprehensive refugee datasets, verifying data from news reports, NGO reports, and commercial satellite imagery. Additionally we built inter-agency compatible data schema that leveraged geographic locations and P-Codes for information integration (P-Code dataset, P-Code Viewer). And to visualize it all, we built custom web mapping applications with tools to interactively explore all of the data across time and space.  A significant portion of the HIU work on Syria (and now Iraq) is available on the HIU Middle East Products page, additionally, the data used for the Refugee and Internally Displaced Peoples layers are available for download on the HIU Data page.
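As a sketch of why shared P-Codes matter for integration, consider joining two datasets that have nothing in common but an admin-unit code. The P-Codes and figures below are invented placeholders, not actual HIU data.

```python
# Minimal sketch of P-Code-based data integration: attaching displacement
# counts to admin units via a shared place code. All codes, names, and
# numbers are hypothetical placeholders.

admin_names = {"SY01": "Damascus", "SY02": "Aleppo", "SY03": "Homs"}
idp_counts = {"SY02": 120000, "SY03": 45000}  # from a different source

def join_on_pcode(names, counts):
    """Join two datasets keyed only by their shared P-Codes."""
    return {
        pcode: {"name": name, "idps": counts.get(pcode, 0)}
        for pcode, name in names.items()
    }

joined = join_on_pcode(admin_names, idp_counts)
print(joined["SY02"])  # {'name': 'Aleppo', 'idps': 120000}
```

The value of the scheme is that any agency publishing data keyed to the same P-Codes can be merged this way, without fragile joins on place-name spellings.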

It is clear the appreciation and value for geographic data, analysis, and visualization are on the rise; FCW lists the “Why Maps Matter” article as the 3rd most popular of 2014. Fully recognizing the value of geography requires that the notion of maps as “pieces of paper” be replaced with an appreciation and use of geographic data and spatial analysis as tools of policy formation. This change is happening, albeit slower than I'd like, but its adoption will result in better, more agile policy, and benefit the government and citizens alike.

Clip of Syria map produced by the HIU, full map available here – https://hiu.state.gov/Products/Syria_DisplacementRefugees_2014Oct23_HIU_U1109.pdf

QGIS 2.0 Update // Install on Windows

In response to my previous post on the challenges of installing the new QGIS 2.0 version, I wanted to highlight a new post by the folks over at Digital Geography. Not only did they write a good post on the Ubuntu install of QGIS 2.0 (now updated with an install video), but they have followed up with a summary of the Windows install of QGIS 2.0.

I posted a comment about my concerns regarding the automatic installation of the SAGA and OTB dependencies, and Riccardo answered that the Windows install does include both. I haven't tested it yet, but automatically including them would be great. I'd appreciate it if anyone could confirm this for the Windows install, and whether the Ubuntu install has been updated.

Links:
https://www.disruptivegeo.com/2013/09/qgis-2-0-and-the-open-source-learning-curve/
http://www.digital-geography.com/
http://www.digital-geography.com/install-qgis-2-0-on-ubuntu/
http://www.digital-geography.com/installation-of-qgis-2-0-dufour-on-windows/


QGIS 2.0 and the Open Source Learning Curve

As likely everyone in the geo world knows by now, the widely awaited release of QGIS 2.0 was announced at FOSS4G this week. Having closely watched the development of QGIS 2.0, I was eager to get it installed and take a look at the new features. I am for the most part a Windows user, but I like to take the opportunity to work with the open source geo stack in a Linux environment. My Linux skills are better than a beginner's, but because I don't work with Linux on a daily basis, those skills are rusty. So I decided this was a good excuse to build out a new Ubuntu virtual machine, as I hadn't upgraded since 10.04.

Build Out The Virtual Machine

Let’s just say I was quickly faced with the challenges of getting Ubuntu 12.04 installed in VMware Workstation 7.1. Without getting into details, the new notions of “Easy Install” and its effect on installing VMware Tools took a while to figure out. In case anyone has the Easy Install problem, this YouTube video has the answer, also posted in code below:

sudo mv /etc/rc.local.backup /etc/rc.local
sudo mv /opt/vmware-tools-installer/lightdm.conf /etc/init
# then reboot the virtual machine
sudo reboot

The second problem, about configuring VMware Tools with the correct headers, is tougher. I’m still not sure I fixed it, but given its rating on Ask Ubuntu, it’s obviously a problem people struggle with. Suffice it to say there is no way that I could have fixed this on my own, nor do I really have a good understanding of what was wrong, or what was done to fix it.

The Open Source Learning Curve

This leads me to the point of this post: open source is still really tricky to work with. The challenges in getting over the learning curve lead to two simultaneous feelings: one, absolute confusion at the problems that occur, and two, absolute amazement at the number of people who are smart enough to have figured them out and posted about it on forums to help the rest of us. While it's clear the usability of open source is getting markedly better (even from my novice perspective), it still feels as if there is a “hazing” process that I hope can be overcome.

QGIS 2.0 Installation

With that in mind, here is some of what I discovered in my attempt to get QGIS 2.0 installed on the new Ubuntu 12.04 virtual machine. First, I followed the official QGIS installation directions, but unfortunately I was getting errors in the Terminal related to the QGIS public key, and a second set of errors on load. Then I discovered the Digital Geography blog post that had the same directions, plus a little magic command at the end (see below) that got rid of the on-load errors.

sudo chown -R "your username" ~/.qgis2   # substitute your actual username

After viewing a shapefile and briefly reviewing the new cartography tools, I wanted to look at the new analytical tools. But there was a problem, and they didn't load correctly. So back to the Google, and after searching for a few minutes I found the Linfiniti blog post on making Ubuntu 12.04 work with QGIS Sextante. After a few minutes downloading tools I check again; this time the image processing tools are working, but I keep getting a SAGA configuration error. After figuring out what SAGA GIS is, and reading the SAGA help page from the error dialog box, it was back to searching. The key this time was a GIS Stack Exchange post on configuring SAGA and GRASS in Sextante; this time the code snippet of glory is:

sudo ln -s /usr/lib64/saga /usr/lib/saga

Next, back to QGIS to fire up a SAGA-powered processing routine, and the same error strikes again. More searching and reading; now it's about differences between SAGA versions and what's in the Ubuntu repos. With no clear answer, I check the Processing options, where there is an “Options and Configuration” dialog, which has a “Providers” section, which has an “Enable SAGA 2.0.8 Compatibility” option. Upon selecting the option, boom, all the Processing functions open. Mind you, I haven't checked whether they actually work; after several hours they now at least open.

"Processing" configuration options in QGIS 2.0
“Processing” configuration options in QGIS 2.0

Conclusion

It is clear that the evolution of open source geo took several steps forward in this last week, and the pieces are converging into what will be a true end-to-end, enterprise-grade, geographic toolkit. And while usability is improving, in my opinion, it is still the biggest weakness. Given the high switching costs for organizations and individuals to adopt and learn open source geo, the community must continue to drive down those costs. It is unfair to ask the individual developers who commit to these projects to fix the usability issues, as they are smart enough to work around them. Seems to me it is now up to the commercial users to dedicate resources to making the stack more usable.

Key Resources:
http://changelog.linfiniti.com/version/1/
http://www.youtube.com/watch?v=fmyRbxYyOmU
http://askubuntu.com/questions/40979/what-is-the-path-to-the-kernel-headers-so-i-can-install-vmware
http://www.digital-geography.com/install-qgis-2-0-on-ubuntu/
http://linfiniti.com/2012/09/quick-setup-of-ubuntu-12-04-to-work-with-qgis-sextante/
http://gis.stackexchange.com/questions/60078/saga-grass-configuration-in-sextante-ubuntu


The Disruptive Potential of GIS 2.0

‘Disruption is a theory: a conceptual model of cause and effect that makes it possible to better predict the outcomes of competitive battles in different circumstances’ – The Innovator's Solution

My PhD dissertation at the University of Kansas is entitled “The Disruptive Potential of GIS 2.0: An application in the humanitarian domain”. The research involves several interrelated philosophical, technological, and methodological components, but at its core, it is about building a new way to harness the power of geographic analysis. In short, the idea is to show how Geographic Information Systems (GIS) has evolved into something different than it was before, explore the dynamics of that evolution, then build new tools and methods that capitalize on those dynamics.

The foundation of the argument is that a new generation of digital geographic tools, defined here as GIS 2.0, has completely changed how core GIS processes are implemented. While the core functions of a GIS remain the same — the creation, storage, analysis, visualization, and dissemination of geographic data — the number of software packages capable of implementing spatial functions and the distribution capacity of the Internet have fundamentally changed the desktop GIS paradigm. Driving GIS 2.0 is a converging set of technology trends including open source software, decreasing computation costs, ubiquitous data networks, mobile phones, location-based services, spatial databases, and cloud computing. The most significant, open source software, has dramatically expanded access to geographic data and spatial analysis by lowering the barrier to entry into geographic computing. This expansion is leading to a new set of business models and organizations built around geographic data and analysis. Understanding how and why these trends converged, and what it means for the future, requires a conceptual framework that embeds the ideas of the Open Source Paradigm Shift and Commons-based Peer Production within the larger context of Disruptive Innovation Theory.

While there is a philosophical element to this argument, the goal of the dissertation is to utilize the insights provided by disruptive innovation theory to build geographic systems and processes that can actually make a difference in how the humanitarian community responds to a complex emergency. It has long been recognized that geographic analysis can benefit the coordination of and response to complex emergencies, yet the deployment of GIS has been hampered by a set of issues related to cost, training, data quality, and data collection standards. Using GIS 2.0 concepts there is an opportunity to overcome these issues, but doing so requires new technological and methodological approaches. With utility as a goal, the research is structured around three general sections:

  1. GIS 2.0 Philosophy: Exploring the fundamental reorganization of GIS processes, and building a conceptual model, based on disruptive innovation theory, for explaining that evolution and predicting future changes
  2. GIS 2.0 Technology: Utilizing GIS 2.0 concepts build the “CyberGIS”, a geographic computing infrastructure constructed entirely from free and open source software
  3. GIS 2.0 Methodology: Leverage the CyberGIS and GIS 2.0 concepts to build the “Imagery to the Crowd” process, a new methodology for crowdsourcing geographic data that can be deployed in a range of humanitarian applications

In the next series of posts I will explore each of the points above. My goal is to complete the dissertation in the coming months and I want to use this blog as a staging ground for drafts, chapters, and articles that can be submitted to my committee. As such they will likely be a bit rough. I am a perfectionist in my writing, which only serves to completely slow down my productivity, so hopefully this will force me to “release early and often.”

The core arguments of GIS 2.0 were originally conceived during 2006–2008, so they are a bit dated now. At the time there was really only anecdotal evidence to support the argument that the same Web 2.0 forces that built Wikipedia and disrupted the encyclopedia market were going to impact GIS. However, with the continued rise of FOSS4G, OpenStreetMap, and now the Humanitarian OpenStreetMap Team (HOT), it feels almost redundant to be making this argument now. Additionally, from the technology perspective there are lots of individuals and groups out there doing more cutting-edge work than I ever will, but I hope the combination of philosophical approach and actual implementation can be a contribution to the discipline of geography — and more importantly, help the humanitarian community be more effective.

As always, constructive comments are welcome.

Benkler, Y. 2002. Coase’s Penguin, or, Linux and “The Nature of the Firm.” Yale Law Journal 112 (3):369–446.
Christensen, C. M., and M. E. Raynor. 2003. The Innovator’s Solution: Creating and Sustaining Successful Growth. Boston: Harvard Business School Publishing.
Clarke, K. C. 2001. Getting started with geographic information systems 3rd ed. Upper Saddle River, N.J: Prentice Hall.
Kelmelis, J. A., L. Schwartz, C. Christian, M. Crawford, and D. King. 2006. Use of Geographic Information in Response to the Sumatra-Andaman Earthquake and Indian Ocean Tsunami of December 26, 2004. Photogrammetric Engineering and Remote Sensing 72:862–876.
National Research Council. 2007. Successful response starts with a map: improving geospatial support for disaster management. Washington, D.C: National Academies Press.
O’Reilly, T. 2007. What is Web 2.0: Design Patterns and Business Models for the Next Generation of Software. Communications & Strategies 1:17. http://ssrn.com/abstract=1008839 (last accessed 22 January 2013).
Verjee, F. 2007. An assessment of the utility of GIS-based analysis to support the coordination of humanitarian assistance. http://pqdtopen.proquest.com/#viewpdf?dispub=3297449 (last accessed 7 March 2013).
von Hippel, E. 2005. Open Source Software Projects as User Innovation Networks. In Perspectives on Free and Open Source Software, eds. J. Feller, B. Fitzgerald, S. A. Hissam, and K. R. Lakhani, 267–278. Cambridge: MIT Press http://mitpress.mit.edu/books/perspectives-free-and-open-source-software.
Wood, W. B. 2000. Complex emergency response planning and coordination: Potential GIS applications. Geopolitics 5 (1):19–36. http://www.tandfonline.com/doi/abs/10.1080/14650040008407665 (last accessed 7 March 2013).

USGIF Achievement Award

One of the interesting things about the “Imagery to the Crowd” projects has been the positive feedback we have received from a range of different communities. Ultimately we built the process from a belief that free and open geographic data could support the effective provision of humanitarian assistance, and that the power of open source software and organizations were the key to doing this efficiently.

Our goal with Imagery to the Crowd is to provide a catalyst, in the form of commercial high-resolution satellite imagery, to enable the volunteer mapping community to produce data in areas experiencing (or in risk of) a complex emergency. In many ways I thought of this process as trying to link the “cognitive surplus” of the crowd with the purchasing power of the United States Government, to help humanitarian and development organizations harness the power of geography to do what they already do better.

Somewhat surprisingly, a community outside of the humanitarian sector recognized the potential impact of this process, and the HIU was awarded the United States Geospatial Intelligence Foundation (USGIF) Government Achievement Award 2012 (Press Release, Symposium Daily pdf, Video page). The award was presented at the GeoInt Symposium in Orlando, FL (Oct 7-14 2012). Below is a video of the awards presentation, and includes the Academic and Industry Division winners from this year. The section on the HIU begins around the 7:25 mark.


At the conference I was also on a panel in a “GeoInt Forward” session focused on open source software. This panel was actually the best part of the conference. Typically the first day of the GeoInt Symposium is reserved for the golf event, but this year the organizers included an additional day of panel sessions. In general these sessions were very well attended, and with a full house of approximately 250 people, the session on open source software exceeded my expectations. The session description and other panelists are listed below; the defense and intelligence perspective of GeoInt is clear, but it was an interesting group doing work across a range of different applications. I tried to provide a bit of balance and discussed the philosophical approach to open source, and its potential as an organizing principle for organizations. The Imagery to the Crowd project is built on a cloud-hosted open source geographic computing infrastructure, so I could speak to the reality of such a system. It seems that the coming budget austerity has generated significant interest in open source, and now could be a golden opportunity.

From the conference proceedings:
“Open Source Software (OSS) has moved from being a backroom, developers-only domain to a frontline component inside key military capabilities. OSS isn’t doing everything—yet—but it is slowly commoditizing key strategic parts of geospatial infrastructure, from operating systems to databases to applications. In this session, key government program managers will discuss where and how they see OSS moving to solve warfighter needs, as well as assess the gaps in OSS investment and capabilities.”

Moderator – John Scott, Senior Systems Engineer & Open Tech Lead, RadiantBlue
Panelists
• John Snevely, DCGS Enterprise Steering Group Chair
• Col Stephen Hoogasian, U.S. Air Force, Program Manager, NRO
• Keith Barber, Senior Advisor, Agile Acquisition Strategic Initiative, NGA
• John Marshall, Chief Technology Officer, J2, Joint Staff
• Dan Risacher, Developer Advocate, Office of the Chief Information Officer, DoD
• Josh Campbell, GIS Architect, Office of the Geographer & Global Issues, State Department

Reference Cited:

Shirky, C. 2011. Cognitive Surplus: How Technology Makes Consumers into Collaborators. New York: Penguin Books.


Imagery to the Crowd, ICCM 2012

Here is my ignite talk on the “Imagery to the Crowd” project from the International Conference on Crisis Mapping (ICCM 2012). I’ve attended each of the four ICCM conferences (Cleveland, Boston, Geneva, Washington DC). They have been a great way to understand the organizations that comprise the humanitarian community, and more importantly, meet the individuals who power those organizations. It was exciting to present on our work at the HIU, and contribute back to the Crisis Mapping community.

All of the Ignite talk videos are available at the Crisis Mappers Website (lineup .pdf) and collectively they represent a solid cross-section of the field. At the macro-level, I believe the story continues to be about the integration of these new tools and methodologies into established humanitarian practices. The toolkits are stabilizing (crowdsourcing, structured data collection using SMS, volunteer networks, open geographic data and mapping, social media data mining) and are being adopted by the major humanitarian organizations. While I am partial towards crowdsource mapping, the Digital Humanitarian Network and the UN OCHA Humanitarian eXchange Language (HXL) are two other exciting projects.


Unifying Illustrator, TileMill / CartoCSS, and GeoServer

With the release of TileMill 0.10.0, there are a series of new compositing operations available within the CartoCSS language and rendering engine. A brief review of these features seems to open up a world of new potential.
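For a flavor of what the new compositing operations look like, here is a minimal CartoCSS fragment in the TileMill 0.10-era syntax, blending an overlay into the layers beneath it with a multiply operation (the layer name is hypothetical):

```css
/* Hypothetical overlay layer blended into the basemap below it. */
#overlay {
  polygon-fill: #445566;
  polygon-opacity: 0.6;
  polygon-comp-op: multiply;  /* darken by multiplying with lower layers */
}
```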

However, I have a problem. I work in an organization whose primary product is a hard-copy map. As we evolve our product line, our challenge is to produce digital, interactive, web-enabled versions of our hard-copy map content while maintaining a high level of cartographic goodness. Our cartographic team works in Adobe Illustrator and is quite proficient in it.

The problem we face is having to do cartographic work twice in order to switch between Illustrator and the Web. There has to be a better way. We are currently using TileMill for a significant amount of our web rendering, but also intend to transition to GeoServer. This means we have .AI, CartoCSS and .SLD in the mix.

We need to keep Illustrator as our foundation for creating content, so what is the best option to extend to the web? First is the problem of converting from .AI to CartoCSS, specifically the conversion of graphic styles for each Illustrator layer to its CartoCSS equivalent. I’ve never heard of a converter tool for this and am interested to know if anyone (@opengeo @ortelius @mapbox @ericg @tmcw @springmeyer @kelso @mattpriour…anyone at Adobe) has an idea if it is feasible or the amount of effort it would take.
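To make the idea concrete, here is a toy sketch of what such a converter might do, mapping a simplified (entirely hypothetical) record of an Illustrator layer's appearance to equivalent CartoCSS; a real tool would need to parse .ai graphic styles themselves.

```python
# Toy sketch of an Illustrator-style-to-CartoCSS converter. The input
# schema is a hypothetical simplification of an Illustrator layer's
# appearance; it is not a real .ai parser.

def to_cartocss(layer):
    """Emit a CartoCSS rule for one simplified layer-style record."""
    props = []
    if "fill" in layer:
        props.append(f"  polygon-fill: {layer['fill']};")
    if "stroke" in layer:
        props.append(f"  line-color: {layer['stroke']};")
        props.append(f"  line-width: {layer.get('stroke_width', 1)};")
    return "#%s {\n%s\n}" % (layer["name"], "\n".join(props))

style = {"name": "parks", "fill": "#cdebb0",
         "stroke": "#9ac07c", "stroke_width": 0.5}
print(to_cartocss(style))
```

Even a partial converter like this would save the double cartographic work, leaving only the edge cases (gradients, effects, type) for manual styling.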

Second is the problem of having to convert .AI to both CartoCSS for TileMill and SLD for GeoServer. The best option would be to have GeoServer consume CartoCSS natively, that way offline tiles and web services could maintain the same cartographic styling.

I posed similar questions on Twitter to @cageyjames and @spara, and @spara's reply got me thinking about whether I was looking at this question the wrong way. Is it too old school to be considering tools like this? I wanted to know if anyone else had been thinking about it.

Thankfully @mattpriour replied that work was already being planned at OpenGeo to implement CartoCSS for GeoServer. And @godwinsgo also replied that GeoServer already has some form of CSS style rendering. So it looks like CartoCSS in GeoServer has a chance of being completed; that just leaves the .AI to CartoCSS conversion.

Anyone have any thoughts?
