Ghost Ecosystems
Bubbling this old post to the top given some recent chatter around using traditional indicators of urban decay — trash, graffiti, boarded-up windows — to make inferences using predictive modeling (and, of course, AI). As a system for understanding future municipal resource deployment for spot-fixes and infrastructural interventions — and decidedly not to resurrect discredited theories suggesting these indicators encourage crime — this makes a lot of sense (enough for me to spend several years of my professional life on it).
But as a wider lens into the systemic causes of decay and ultimately ecosystem collapse, like of an entire town or species as the post below jauntily tours, it misses the forest fire for a few fallen trees. AI of course is being used in many different scientific domains to map deeper foundational, existential threats, but given where we are in the hype curve with LLMs there’s a lot of money (and attention and compute cycles and energy consumption) going into more surface-level predictive tech. This is probably because it’s a lot easier to fill a pothole than move a city off of fossil fuels.
Speaking of fossils, here’s the old post.
I teach a course at CU Denver on urban technologies where, to the bewilderment of most students, we begin by studying a decrepit urban typology somewhat unique to the American West: ghost towns. Where students are expecting robot cars and sparkling sci-fi skylines they get depopulated ruins and crumbling foundations. It takes several sessions before students appreciate why we start this way. The afterlife of towns and cities exposes quite a bit about why they were created, what assumptions they were built upon, and what larger systems they are enmeshed in. If these towns are ghosts, how exactly did they die?

Shovel Ready
Heading into winter late last year we were told that it was going to be one for the record books. And so it has been. Temperatures have yet to go into minus territory and there are towns in Texas with more snowfall than we’ve had. It’s downright bizarre.
But I’m a believer in meteorological karma. Sure, we’re trending way behind average snowfall to date. But that doesn’t mean Old Man Winter can’t go for a late game Hail Mary. I’ll put away the shovel in June.
That’s basically the philosophy of preparedness behind a slew of winter-focused applications created by the City of Chicago over the last weeks at chicagoshovels.org.
It’s all about scales of sharing, really. Last year’s blizzard showed a side of our city rarely talked about: authentic neighborliness. Chicagoans came to each other’s aid, made friends on stranded public transit, and generally bonded in the face of potential calamity. The idea behind Chicago Shovels is to facilitate this latent drive to be good neighbors, to offer tools for sharing in the common experience of a heavy snowfall.
The sharing extends to the code itself. Civic-minded volunteers came together to build parts of Chicago Shovels and some of the code itself was shared via a Code for America-developed project in Boston. And, of course, we’re sharing what we’ve built on our Github account. Cross-municipality, open source development is the way forward.
Many different threads of Mayor Emanuel’s technology mandate are bound together in Chicago Shovels.
Plow Tracker, the first app to launch and certainly the most popular, is a good case study in open data for transparency and accountability. While I talk a lot about open data as a driver of economic development and as analytics fodder, the lesson from Plow Tracker’s launch — and the record-shattering traffic it drove to the city’s website — is that we shouldn’t forget that the ability to peer into the workings of government is the first and possibly most important function of open data.
The Tracker is a good illustration of our open data initiatives: more information is always better than less. If there are patterns to be found, they will be. And no matter what they are, such analysis leads to a more efficient city government.
Plow Tracker is only on during storms, of course. It shows where plows are in real time, with a bit of information about the city “asset” you are looking at. Usually these are salt-spreading plows, but bigger snow events can bring in garbage trucks with “quick-hitch” plows attached and even other city vehicles outfitted to plow. As Chicagoist pointed out, watching the map can remind you of a certain popular video game from the 1980s.
Feedback from the public, coverage in the press, and inquiries from other cities have been overwhelmingly positive. Many have asked for increased functionality, such as a visualization of which streets have been cleared. This is tough, as we do not have real-time data on the status of city streets beyond what can be visually inspected via cameras and by the plow drivers themselves.
Undaunted, the team at Open City took the Plow Tracker data and created Clear Streets. Where Plow Tracker shows where the plows are, Clear Streets shows you where they have been. (If we’re keeping with the gaming analogies, this is to Etch-a-Sketch what the Tracker is to Pac-Man.)
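To get a feel for how a derived app like Clear Streets can work, here is a toy sketch of turning position snapshots into a coverage map. The field layout and the grid-bucketing approach are my own assumptions for illustration, not Open City's actual method:

```python
from collections import defaultdict

# Hypothetical position snapshots in roughly the shape a plow feed might
# publish: (asset_id, lat, lon, timestamp). The field order is an assumption.
snapshots = [
    ("plow-12", 41.8781, -87.6298, "2012-01-20T06:00:00"),
    ("plow-12", 41.8785, -87.6310, "2012-01-20T06:05:00"),
    ("plow-07", 41.9000, -87.6500, "2012-01-20T06:02:00"),
]

def plowed_cells(snapshots, precision=3):
    """Approximate 'where plows have been' by bucketing each GPS fix
    into a coarse grid cell; any cell with a fix counts as covered."""
    trails = defaultdict(set)
    for asset, lat, lon, _ts in snapshots:
        trails[asset].add((round(lat, precision), round(lon, precision)))
    # The union of every asset's cells is the city-wide coverage map.
    return set().union(*trails.values())

covered = plowed_cells(snapshots)
```

A real implementation would snap fixes to street centerlines rather than a grid, but the principle is the same: accumulate where the plows have been rather than just showing where they are.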
Clear Streets is obviously useful and a great example of the ecosystem of civic developers that is growing on the periphery of government thanks to open data. And with Chicago’s digital startups reaching critical mass and real attention from venture firms, the city is doing its part in nurturing “civic startups” like Open City. (Here’s a clip of the Open City crew and me discussing all this on WTTW’s Chicago Tonight.)
A last note on (and lesson from) the Tracker: context is key. The little text blurb above the map is really crucial to understanding what you are looking at. As an example, sometimes plows are deployed before it starts snowing for preemptive salting of bridge decks. If you did not have this information it would be difficult to rationalize the placement of plows. It’s a lesson for open data in general. The more data, especially real-time data, the more context matters.

Adopt-a-Sidewalk
The site’s most recently launched app, Adopt-a-Sidewalk, represents the original idea for Chicago Shovels.
Last fall, as Chicago was preparing to become a 2012 Code for America city, we learned about a side project from the Code fellows in Boston. Early in 2011 they had arrived to work on a project with Boston schools but were met with a blizzard. So they built Adopt-a-Hydrant. This idea was to encourage residents to “claim” fire hydrants for shoveling out during the winter. Simple, smart, the right thing to do. And the code was open source.
So we took it with the idea of creating something similar but focused on the public way. We thought we could go a bit bigger than hydrants. See, the Chicago Municipal Code requires residents and businesses to shovel the sidewalk in front of their property. So why not allow them to claim it or claim someone else’s — or ask for help? Claim a parcel, mark it as cleared, track your achievements.
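The mechanics of that flow are simple enough to sketch. This is a toy model of claim/clear, with invented class and field names rather than the app's actual schema:

```python
# Minimal sketch of the claim/clear flow described above; names are
# illustrative, not Adopt-a-Sidewalk's real data model.
class SidewalkParcel:
    def __init__(self, parcel_id):
        self.parcel_id = parcel_id
        self.adopter = None
        self.cleared = False
        self.help_requested = False

    def claim(self, resident):
        """One adopter per parcel; later claims are refused."""
        if self.adopter is not None:
            return False
        self.adopter = resident
        return True

    def request_help(self):
        """Anyone who can't shovel can flag the parcel for neighbors."""
        self.help_requested = True

    def mark_cleared(self):
        self.cleared = True
        self.help_requested = False

walk = SidewalkParcel("parcel-001")  # hypothetical parcel id
walk.claim("Ada")
walk.mark_cleared()
```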
Alas, winter in Chicago has long been associated with “claiming” parts of the public way. Adopt-a-Sidewalk attempts to capitalize on this impulse for the good of pedestrians, minus the lawn furniture and household detritus. (And we’re not the only ones trying to expand the definition of wintertime dibs to the sidewalk itself.)
Again, this has been one weird winter. Snow is scarce and temps are routinely above 40. If it ever does snow again, though, Adopt-a-Sidewalk is ready to promote community responsibility and actual sharing. We partnered with local startup OhSoWe to integrate neighborhood-based sharing into Adopt-a-Sidewalk. Locate your sidewalk — or a parcel you’d like to help out on — and instantly see who around you is willing to lend shovels, salt, even a snow blower.
It’s been noted that the City creating its own apps is a bit of a departure from our data-centric approach to date. (Noted, I might add, in the most strenuous way, with real constructive criticism from engaged residents.)
The truth is that having Mother Nature on the critical path to deployment is a tough, stressful thing. (She’s neither agile nor a fan of the Gantt.) We knew snow was coming and we knew we needed Plow Tracker up for the first major storm. Launching an app was something that could not slip. Adopt-a-Sidewalk, while built with volunteer assistance, was partially an effort at proving that municipal code sharing is real and viable. Both of these builds demonstrate that the City will create apps when there are reasons to do so. But that in no way detracts from our belief that the community and the marketplace are the sources of real innovation that come from Chicago’s open data.
And we have that too. Chicago Shovels’ last major app category showcases community-built applications. The two most useful are actually wintertime reworkings of earlier incarnations.
Last year civic über-developer Scott Robbin built SweepAround.us, an app for alerting residents the night before the City would be sweeping streets in their area so they could move their cars from the street (avoiding a ticket). This was the perfect app for tweaking to accommodate a system for alerts about the City’s 2″ Snow Parking Ban. SweepAround.us became 2inch.es.
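The underlying logic of an alert service like 2inch.es is straightforward. Here is a hedged sketch: the two-inch threshold comes from the ordinance, but the function names and message text are invented for illustration:

```python
# Toy version of a snow-ban alert check. Only the 2-inch threshold is
# real; everything else here is an illustrative assumption.
BAN_THRESHOLD_INCHES = 2.0

def ban_alerts(forecast_inches, subscribers):
    """Return one alert message per subscriber if the forecast
    triggers the parking ban; otherwise no messages at all."""
    if forecast_inches < BAN_THRESHOLD_INCHES:
        return []
    return [
        f"{name}: {forecast_inches} inches forecast; the 2-inch parking "
        "ban is in effect, so move your car off arterial streets."
        for name in subscribers
    ]

msgs = ban_alerts(2.5, ["ada", "grace"])
```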
Similarly, Robbin’s wildly popular wasmycartowed.com was updated to include automobile relocations due to snow emergencies.
A slew of winter resources round out the site, including a number of winter-related apps from last year’s Apps for Metro Chicago competition, information on how to become part of the City’s official volunteer “Snow Corps”, one-click 311 request submission, FAQs, and subscription to Notify Chicago alerts.
Chicago Shovels is the city’s best example to date of the value of open data. Transparency and accountability (Plow Tracker), reuse and sharing (Adopt-a-Sidewalk), business creation (Clear Streets), efficiency and ease-of-use (2inch.es and wasmycartowed.com) — these are the outcomes of a policy of exposing the vital signs of the city.
Now if it would only snow — and stick.
Open data in Chicago: progress and direction
In a wonderfully comprehensive overview of Government 2.0 in 2011 up at the O’Reilly Radar blog, Alex Howard highlights “going local” as one of the defining trends of the year.
All around the country, pockets of innovation and creativity could be found, as “doing more with less” became a familiar mantra in many councils and state houses.
That’s certainly been the case in the seven-and-a-half months since Mayor Emanuel took the helm in Chicago. Doing more with less has been directly tied to initiatives around data and the implications they have had for real change of government processes, business creation, and urban policy. I’d like to outline what’s been accomplished, where we’re headed and, importantly, why it matters.
The Emanuel transition report laid out a fairly broad charge for technology in his office.
Set high standards for open, participatory government to involve all Chicagoans.
In asking ourselves why open and participatory mattered, we developed the following four principles. The first two are fairly well-established tenets of open government; the last two are long-term policy rationales for positioning open data as a driver of change.

First, the raw materials. Chicago’s data portal, which was established under the previous administration, finally got a workout. It currently hosts 271 data sets with over 20 million rows of data, many updated nightly. Since May 16 the portal has been viewed 733,201 times and over 37 million rows of data have been accessed.
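For the curious, the portal runs on Socrata, so those rows can be pulled programmatically as well as browsed. A minimal sketch of building a query URL; the dataset id here is a placeholder, and the code only constructs the request rather than sending it:

```python
from urllib.parse import urlencode

# The portal is a Socrata instance, whose SODA API takes $-prefixed
# query parameters like $limit and $where. The dataset id below is a
# placeholder; real ids appear in each dataset's portal URL.
BASE = "https://data.cityofchicago.org/resource"

def soda_url(dataset_id, limit=1000, where=None):
    """Build (but do not send) a SODA query URL for one dataset."""
    params = {"$limit": limit}
    if where:
        params["$where"] = where
    return f"{BASE}/{dataset_id}.json?{urlencode(params)}"

url = soda_url("xxxx-xxxx", limit=50, where="date > '2011-01-01'")
```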
But it’s the quality rather than the quantity that’s worth noting. Here’s a sampling of the most accessed data sets.
- TIF Projection Reports
- Building Permits (2006 to present)
- Food Inspections
- Vacant and Abandoned Buildings Reported
- Crimes – 2001 to Present (more block-level crime data than any other city, updated nightly)
Here’s a map view of the installed bike rack data set.
As a start towards full-fledged performance management, we launched cityofchicago.org/performance for tracking most anything that touches a resident: hold time for 311 service requests, time to pavement cave-in repair, graffiti removal and business license acquisition, zoning turnarounds, and similar. Currently there are 43 measurements, updated weekly. Here’s an example for average time for pothole repair.
To be sure, all this data can be inscrutable to residents. (One critic of the effort called it “democracy by spreadsheet”.) But the data is merely a foundation, not an end in itself. As we make the publication of this data part of departments’ standard operating procedure, the goal has shifted to the creation of tools, internally and in the community, for understanding the data.
As a way of fostering development of useful applications, the City joined its data with Chicago-specific sets from the State of Illinois, Cook County, and the Chicago Metropolitan Agency for Planning to launch an app development competition. Anyone with an idea and coding chops who used at least one of the City’s sets was eligible for prize money put up by the MacArthur Foundation.
Run by the Metro Chicago Information Center, Apps for Metro Chicago was open for about six months and received over 70 apps covering everything from community engagement to sustainability. (You can find the winners for the various rounds here: Transportation, Community, Grand Challenge.)
Here are some of my favorite apps created from the City’s open data.
- Mi Parque – A bilingual participatory placemaking web and smartphone application that helps residents of Little Village ensure that their new park is maintained as a vibrant, safe, open, and healthy green space for the community.
- SweepAround.us – Enter your address, receive an email or text message letting you know when the street sweepers are coming to your block so you can move your car and avoid getting a ticket.
- Techno Finder – Consolidated directory of public technology resources in Chicago.
- iFindIt – App for social workers, case managers, providers and residents to provide quick information regarding access to food, shelter and medical care in their area.
The apps were fantastic, but the real output of A4MC was the community of urbanists and coders that came together to create them. In addition to participating in a new form of civic engagement, these folks also form the basis of what could be several new “civic startups” (more on which below). At hackdays generously hosted by partners and at social events organized around the competition, the community really crystallized — an invaluable asset for the city.

Open data hackday hosted by Google
Beyond fulfilling a promise from the transition report, why is any of this important? The overarching answer is not about technology at all, but about culture-change. Open data and its analysis are the basis of our permission to interject the following questions into policy debate: How can we quantify the subject-matter underlying a given decision? How can we parse the vital signs of our city to guide our policymaking?

The mayor created a new position (unique in any city as far as I know) called Chief Data Officer who, in addition to stewarding the data portal and defining our analytics strategy, is instrumental in promoting data-driven decision-making either by testing processes in the lab or by offering guidance for problem-solving strategies. (Brett Goldstein is our CDO. He is remarkable.)
As we look to 2012, four evolutions of open data guide our efforts.
The City-as-Platform
There are a variety of ways to work with the data in the City’s portal, but the most flexible use comes from accessing it via the official API (application programming interface). Developers can hook into the portal and receive a continuously updated stream of data without manually refreshing their applications each time the feed changes. This changes the City from a static provider of data to a kind of platform for application development. It’s a reconceptualization of government not as the provider of the end-user experience (i.e., the app or service itself), but as the provider of the foundation for others to build upon. Think of an operating system’s relationship to the applications that third-party developers create for it.
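In practice, "hooking into the portal" can be as simple as a polling loop that de-duplicates records. A sketch of that consumer pattern, driven by canned batches instead of live HTTP so nothing here names a real endpoint:

```python
import time

# Minimal polling consumer: fetch() stands in for an HTTP GET of a
# feed; on_new receives each record exactly once. The function names
# are my own, not any portal's client API.
def poll(fetch, on_new, interval_s=60, rounds=3):
    """Call fetch() periodically and hand not-yet-seen records to on_new."""
    seen = set()
    for _ in range(rounds):
        for record in fetch():
            if record["id"] not in seen:
                seen.add(record["id"])
                on_new(record)
        time.sleep(interval_s)

# Drive it with canned batches instead of a live feed:
updates = []
batches = iter([[{"id": 1}], [{"id": 1}, {"id": 2}], []])
poll(lambda: next(batches), updates.append, interval_s=0, rounds=3)
```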
Consider the CTA’s Bus Tracker and Train Tracker. The CTA doesn’t have a monopoly on providing the experience of learning about transit arrivals. While it does have web apps, it exposes its data via API so that others can build upon it. (See Buster and QuickTrain as examples.) This model is a hybrid of outsourcing and civic engagement, and it leads to better experiences for residents. And what institution needs a better user experience all around than government?

But what if all City services were “platform-ized” like this? We’re starting 2012 with the help of Code for America, a fellowship program for web developers in cities. They will be tackling Open311, a standard for wrapping legacy municipal customer service systems in a framework that turns them, too, into platforms for straightforward (and third-party) development. The team arrives early in 2012 and will be working all year to create the foundation for an ecosystem of apps that will allow everything from one-snap photo reporting of potholes to customized ward-specific service request dashboards. We can’t wait.
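To make the platform idea concrete: Open311's GeoReport v2 spec defines service-request submission as a POST to /requests.json. Here is a sketch that builds, but does not send, such a request; the host and the service code are placeholders, since real codes come from the spec's GET /services.json listing:

```python
# Builds an Open311 GeoReport v2 service request. The parameter names
# (service_code, lat, long, description, api_key) come from the spec;
# the host and "POTHOLE" code are placeholders for illustration.
def pothole_request(lat, lon, description, api_key):
    url = "https://example-city.gov/open311/v2/requests.json"  # placeholder host
    payload = {
        "api_key": api_key,
        "service_code": "POTHOLE",  # hypothetical; discover real codes via GET /services.json
        "lat": lat,
        "long": lon,
        "description": description,
    }
    return url, payload

url, payload = pothole_request(41.88, -87.63, "Pothole mid-block on Damen", "MY_KEY")
```

The point of the standard is exactly this uniformity: an app written against one city's Open311 endpoint should work against another's with little more than a changed URL.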
The larger implications of platformizing the City of Chicago are enormous, but the two that we consider most important are the Digital Public Way (which I wrote about recently) and how a platform-centric model of government drives economic development. Bringing us to …
The Rise of Civic Startups
It isn’t just app competitions and civic altruism that prompts developers to create applications from government data. 2011 was the year when it became clear that there’s a new kind of startup ecosystem taking root on the edges of government. Open data is increasingly seen as a foundation for new businesses built using open source technologies, agile development methods, and competitive pricing. High-profile failures of enterprise technology initiatives and the acute budget and resource constraints inside government only make this more appealing.
An example locally is the team behind chicagolobbyists.org. When the City published its lobbyist data this year Paul Baker, Ryan Briones, Derek Eder, Chad Pry, and Nick Rougeux came together and built one of the most usable, well-designed, and outright useful applications on top of any government data. (Another example of this, from some of the same crew, is the stunning Look at Cook budget site.)
But they did not stop there. As the result of a recent ethics ordinance the City released an RFP to create an online lobbyist registration system. The chicagolobbyists.org crew submitted a proposal. Clearly the process was eye-opening. Consider the scenario: a small group of nimble developers with deep subject matter expertise (from their work with the open data) go toe-to-toe with incumbents and enterprise application companies. The promise of expanding the ecosystem of qualified vendors, even changing the skills mix of respondents, is a new driver of the release of City data. (Note I am not part of the review team for this RFP.)
One of the earliest examples of civic startups — maybe the earliest — is homegrown. Adrian Holovaty’s Everyblock grew out of ChicagoCrime.org, which itself was a site built entirely on scraped data about Chicago public safety.
For more on the opportunity for civic startups see Nick Grossman’s excellent presentation. (And let’s not forget the way open data — and truly creative hacker-journalists — are changing the face of news media.)
Predictive Analytics
Of all the reasons for promoting a culture of data-driven decision-making, the promise of using deep analytics and machine learning to help us isolate patterns in enormous data sets is the most important. Open data as a philosophy is easily as much about releasing the floodgates internally in government as it is about making data available to the public. To this end we’re building out a geospatial platform to serve as the foundation of a neighborhood-level index of indicators. This large-scale initiative harnesses block- and community-level data to make informed predictions about potential neighborhood outcomes such as foreclosure, joblessness, crime, and blight. Wonkiness aside, the goal is to facilitate policy interventions in the areas of public safety, infrastructure utilization, service delivery, public health, and transportation. This is our moonshot, and 2012 is its year.
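One simple way to combine block-level indicators into a single index is to z-score each indicator and average the scores per block. This is a toy illustration only; the indicator names, equal weighting, and method are my assumptions, not the City's actual model:

```python
from statistics import mean, pstdev

# Toy neighborhood index: standardize each indicator across blocks,
# then average per block. Names and equal weights are invented.
def zscores(values):
    """Standardize a list of values (population std; 0.0 if constant)."""
    m, s = mean(values), pstdev(values)
    return [(v - m) / s if s else 0.0 for v in values]

def risk_index(indicators):
    """indicators: dict of name -> per-block values, blocks in the same order.
    Returns one combined score per block; higher = more at risk."""
    columns = [zscores(vals) for vals in indicators.values()]
    return [mean(block) for block in zip(*columns)]

idx = risk_index({
    "vacant_buildings": [1, 4, 9],
    "foreclosure_filings": [0, 2, 7],
})
# The third block is high on every indicator, so it scores highest.
```

The real platform would of course weight and validate indicators against observed outcomes rather than averaging them naively, but the shape of the computation is the same.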
Above, a very granular map from early in the administration isolating food deserts in Chicago (click image for larger). It is being used to inform our efforts at encouraging new fresh food options in our communities. This, scaled way up, is the start of a comprehensive neighborhood predictive model.
Unified Information
The same platform that aggregates information geo-spatially for analytics by definition is a common warehouse for all City data tied to a location. It is, in short, a corollary to our emergency preparedness dashboards at the Office of Emergency Management and Communication (OEMC), a visual, cross-department portal into information for any geographic point or region. This has obvious implications for the day-to-day operations of the City (for instance, predicting and consolidating service requests on a given block from multiple departments).
But it also is meaningful for the public. Dan O’Neil recently wrote a great post on the former Noel State Bank Building at 1601 N. Milwaukee. It’s a deep dive into the history of a single place, using all kinds of City and non-City data. What’s most instructive about the post is the difficulty in aggregating all this information and the output of the effort itself: Dan has produced a comprehensive cross-section of a small part of the city. There’s no reason that the City cannot play an important role in unifying and standardizing its information geo-spatially so that a deep dive into a specific point or area is as easy as a Google search. The resource this would provide for urban planning, community organizing and journalism would be invaluable.
There’s more in store for 2012, of course. It’s been an exhilarating year. Thanks to everyone who volunteered time and energy to help us get this far. We’re only just getting started.
The kind of Innovation Chicago is
The economist Edward Glaeser has called Chicago “a city built upon corn in porcine form”. He was referring to the city’s remarkable 19th-century transmutation of the natural bounty of prairie agriculture into a higher-value form of commerce: pigs. The innovations required for their cold storage and transportation helped Chicago become the central node in a nation-spanning rail network. It was the beginning of greatness.
We can effect this transformation again. Our natural bounty today is data, knowledge, and ideas — their “form” the establishment of new businesses and a more livable Chicago.
Today Chicago gets a new mayor and a new administration. I’m proud to be its Chief Technology Officer, a role I took for a single, simple reason. For the last three years I have traveled the world consulting with cities on strategies for making them smarter, more efficient, and more responsive to citizens. Many of these talks and projects were fruitful, but none of them mattered to me personally. None of them, in short, mattered to the city I love most.
The coming of the web, you may recall, was cause for all kinds of pronouncements that we’d move away from each other, tied only by network communications, happily introverted in electronic cocoons. This has not happened (indeed the reverse is happening). If anything the ubiquity of network technologies has proven that place matters. Mobile computing and “checking-in”-style apps are ascendant because we are creatures of place. And my place, the place of four generations of my family, is Chicago. It’s time to focus my effort here.
The transition report published last week is a roadmap for the change the Emanuel administration will undertake. All of the initiatives are exciting and important, but one speaks directly to remaking Chicago as a hub of information that leads to insight and growth.
Set high standards for open, participatory government to involve all Chicagoans
Why do this?
Without access to information, Chicagoans cannot effectively find services, build businesses, or understand how well City government is performing and hold it accountable for results.
How will we do this?
The City will post on-line and in easy-to-use formats the information that Chicagoans need most. For example, complete budget documents – currently only retrievable as massive PDF documents – will be available in straightforward and searchable formats. The City’s web site will allow anyone to track and find information on lobbyists and what they are lobbying for as well as which government officials they have lobbied. The City will out-perform the requirements of the Freedom of Information Act and publicly report delays and denials in providing access to public records.
The City will also place on-line information about permitting, zoning, and business licenses, including status of applications and requests. And Chicagoans will be asked to participate in Open311, an easy and transparent means for all residents to submit and monitor service requests, such as potholes and broken street lights. Chicagoans will be invited to develop their own “apps” to interpret and use City data in ways that most help the public.
Participatory government isn’t the only use of the wealth of information the city can publish. We intend data-driven decision-making, powered by deep analytics of our services and city vital signs, to be central to the day-to-day business of running Chicago.
A data-centric philosophy is more than transparency and efficiency, too. It is about fostering innovation. Business is built on local resources. Where once we transformed grain into pigs into commodities, we can now provide data that serves as a kind of raw material for new business and new markets.
(One example of this — and proof that the talent to do great things is right here, right now — is the design of a “smart intersection” by the students from George Aye’s Living In A Smart City class at the School of the Art Institute this semester.)
There’s plenty more in store for technology to assist in Chicago’s growth and livability. It’s suffused throughout the transition report: promoting entrepreneurship, increasing access to broadband, treating the street as a platform for interaction itself (more of which on all these in future posts). But the foundation is data. Access is an important first step, followed quickly by the tools and policies for taking action on it.
Chicago knows how to do all this. We’ve been doing it for over a century. We have the talent in the private sector, in academia, and in our non-profits to capitalize on any impetus city government can give. Let’s get going.
What it’s like to match wits with a supercomputer
I spent most of the May 1997 rematch between chess world champion Garry Kasparov and IBM’s Deep Blue supercomputer sitting in a grad school classroom. I think it was Intro to Human-Computer Interaction, ironically enough. The professor projected a clunky Java-powered chess board “webcast” (the term was new, as was the web) so we could follow the match. The pace of chess being deliberative and glacial, it really wasn’t a distraction. Not to mention that, at the time, I didn’t know how to play chess. But I do remember people caring deeply about the outcome. I went to work for IBM the following year.
Deep Blue’s descendant, if not in code or microchips then in the style of its coming-out party, is Watson, a massively parallel assemblage of Power 7 processors and natural language-parsing algorithms. Watson, if you’re not a geek or a game show enthusiast, was the computer that played Ken Jennings and Brad Rutter on Jeopardy Feb. 14-16 of this year. Watson won.
Wednesday of last week I got a chance to play Watson on the Jeopardy set built at our research facility for the show. I did not win.

But I did hold the lead for a time and, in fact, I beat Watson during an unrecorded practice round. Honest!
Jeffrey Plaut of Global Strategy and I were the two human competitors selected to go up against Watson in a demonstration match. We did so at the culmination of a few hours of discussion with leaders from the humanitarian sector on how to expand Watson’s repertoire to put it to work in areas that matter. (More on that in a bit.)
IBM built a complete Jeopardy set for the actual televised match. Sony has lots of experience with this, as Jeopardy often goes on the road. But it’s clearly a hack: TV made the set look a lot bigger than it really is, and the show’s producers had to jump through hoops to provide dressing room space and keep the contestants from interacting with IBM’ers (to avoid claims of collusion, I suppose). Ken Jennings has some typically humorous insight on this.
Trebek was long gone, so we had the project manager for Watson host the session I competed in. He’s actually very good, as Watson went through a year of training with past winners and stand-in hosts. I was to play one round of Jeopardy. The rules were the same as the real game and Watson was at full computing capacity, with two exceptions. We were told that we could ring in and then appeal to the audience for help and, most importantly, Watson’s ring-in time was slowed down by a quarter second. The first I took as an insult — if I was going to compete against a computer I was going to do it myself — the second was a blessing.

Standing at the podium is certainly nerve-wracking. There’s a small screen and light pen for scrawling your name and then the buzzer. I stood in Jennings’ spot and it was striking to see how worn the paint was on the buzzer. From sweat? Who knows, but that thing looked like it had been squeezed to death. Contestants can see the clue board and the host, of course, but there’s also a blue bar of light underneath the clues which is triggered manually by a producer once the host finishes reading the last syllable of the clue. This is the most important moment, as ringing in before the blue bar appears locks you out temporarily. Watson had to wait a quarter second at this point and I am convinced it is the only reason we humans were able to get an answer in edgewise.
In a way, this moment is as much human-versus-human as anything. You’re trying to predict exactly when the producer will trigger the go light. Factor in some electrical delay for the plunger and it can be a real crapshoot. This is why past champions perfect their buzzer technique and ring in no matter what. They just assume they will know the answer and be able to retrieve it in the three seconds they are given.
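For fun, here is a back-of-envelope simulation of that ring-in race. Every timing figure is a guess for illustration; the only number taken from the match itself is the quarter-second handicap:

```python
import random

# Crude model of the ring-in race: both parties react to the go light,
# and Watson carries the quarter-second handicap. The human reaction
# distribution and Watson's base latency are illustrative guesses.
def human_win_rate(trials=10_000, seed=42):
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        human = rng.gauss(0.200, 0.050)   # human reaction: ~200 ms +/- 50 ms
        watson = 0.010 + 0.250            # near-instant, plus the handicap
        if human < watson:
            wins += 1
    return wins / trials

share = human_win_rate()
```

Under these (invented) numbers the human wins the buzzer most of the time; take away the 250 ms and the human essentially never does, which squares with the handicap being the only reason we got a word in edgewise.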
I got a bit of a roll going in the category called “Saints Be Praised”. My Catholic upbringing, study in Rome, and fascination with weird forms of martyrdom finally paid dividends. (I also learned after the match that my human competitor was Jewish and largely clueless about the category.) The video above shows me answering a question correctly — something that seems to have shocked my colleagues and the audience. (And I would have disgraced every facet of my personal heritage had I messed up a question about an Italian Catholic from Chicago.)

This clue was more interesting as Watson and I both got it wrong. The category was “What are you … chicken?” about chicken-based foods. Maybe my brain was still in Italian mode as I incorrectly responded “Marsala”, but Watson’s answer — “What is sauce?” — was way wrong, categorically so. This is insightful. For one, the answer, “What is Chicken A La King,” if Watson had come across it at all, was likely confusing since “king” can have so many other contexts in natural language. But Watson was confident enough to ring in anyway and its answer was basically a description of what makes Chicken A La King different from regular chicken. Note that the word “sauce” does not exist in the clue. Watson was finishing the sentence.
What’s most important and too-infrequently mentioned is that Watson is not connected to the Internet. And even if it were, because of the puns, word play, and often contorted syntax of Jeopardy clues, Google wouldn’t be very useful anyway. Try searching on the clue above and you’ll get one hit — and that only because we were apparently playing a category that had already been played and logged online by Jeopardy fans. The actual match questions during the Jennings-Rutter match were brand new. The Internet is no lifeline for questions posed in natural language.

At one point I had less than zero (I blew a Daily Double) while Jeff got on a roll asking the audience for help. And the audience was nearly always right. Call it human parallel processing. But if I was going to go down in flames to a computer I was damn sure not going to lose to another bag of carbon and water. I did squeak out a victory with a small “v” — and Watson was even gracious about it.
Thinking back it is interesting to note that nearly all my correct answers were from things I had learned through experience, not book-ingested facts. I would not have known the components of Chicken Tetrazzini if I did not love to eat it. I would probably not know Mother Cabrini if I didn’t take the L past the Cabrini-Green housing project every day on the way to work. This is the biggest difference between human intelligence and Watson, it seems to me. Watson does learn and make connections between concepts — and this is clearly what makes it so unique — but it does not learn in an embodied way. That is, it does not experience anything. It has no capacity for a fact to be strongly imprinted on it because of physical sensation, or habit, or happenstance — all major factors in the human act of learning.
In Watson’s most-discussed screw-up on the actual show, where it answered “Toronto” when given two clues about Chicago’s airports, there’s IBM’s very valid explanation (weak category indicator, cities in the US called Toronto, difficult phrasing), but it was also noted that Watson has never been stuck at O’Hare, as virtually every air traveler has. (The UK-born author of this piece has actually been stranded for so long that he wandered around the airport and learned that it was named for the WWII aviator Butch O’Hare.) Which isn’t to say that a computer could never achieve embodied knowledge, but that’s not where we are now.

But all of it was just icing on the cake. The audience was not there to see me make a fool of myself (though perhaps a few co-workers were). We were there to discuss the practical, socially-relevant applications of Watson’s natural-language processing in fields directly benefiting humanity.
Healthcare is a primary focus. It isn’t a huge leap to see a patient’s own description of what ails him or her as the (vague, weakly-indicating) clue in Jeopardy. Run the matching algorithm against the huge corpus of medical literature and you have a diagnostic aid. This is especially useful in that Watson could provide the physician its confidence level and the logical chain of “evidence” that it used to arrive at the possible diagnoses. Work to create a “Doctor” Watson is well underway.
As interesting to my colleagues and me are applications of Watson to social services, education, and city management. Imagine setting Watson to work on the huge database of past 311 service call requests. We could potentially move beyond interesting visualizations and correlations to more efficient ways to deploy resources. This isn’t about replacing call centers but about enabling them to view 311 requests — a kind of massive, hyperlocal index of what a city cares about — as an interconnected system of causes and effects. And that’s merely the application most interesting to me. There are dozens of areas to apply Watson, immediately.
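A minimal sketch of what treating 311 requests as a system of causes and effects might look like. The records and request types below are invented for illustration (no real city data, and nothing to do with how Watson actually works): count how often two request types are filed on the same block within a few days of each other, surfacing candidate cause-and-effect pairs.

```python
from collections import Counter
from itertools import combinations

# Hypothetical 311 records: (day, city block, request type).
requests = [
    (1, "block-12", "water main break"),
    (2, "block-12", "pothole"),
    (3, "block-12", "pothole"),
    (1, "block-07", "graffiti"),
    (4, "block-07", "broken streetlight"),
    (5, "block-07", "graffiti"),
    (2, "block-33", "water main break"),
    (4, "block-33", "pothole"),
]

def co_occurring_types(records, window_days=3):
    """Count pairs of distinct request types filed on the same block
    within a short time window -- a crude first pass at spotting
    requests that travel together (say, water main breaks followed
    by potholes)."""
    pairs = Counter()
    for (d1, b1, t1), (d2, b2, t2) in combinations(records, 2):
        if b1 == b2 and t1 != t2 and abs(d1 - d2) <= window_days:
            pairs[tuple(sorted((t1, t2)))] += 1
    return pairs

for pair, count in co_occurring_types(requests).most_common():
    print(pair, count)
```

Real analysis would obviously need geocoding, seasonality, and much more care about confounders; the point is only that the raw 311 ledger already contains relational structure waiting to be read.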
The cover story of The Atlantic this month, Mind vs. Machine, is all about humanity’s half-century attempt to create a computer that would pass the Turing Test — which would, in other words, be able to pass itself off as a human, convincingly. (We’re not there yet, though we’ve come tantalizingly close.) Watson does not pass the Turing Test, for all sorts of reasons, but the truth is that what we’ve learned from it — what I learned personally in a single round of Jeopardy — is that the closer we get to creating human-like intelligence in a machine, the more finely-nuanced our understanding of our own cognitive faculties becomes. The last mile to true AI will be the most difficult, primarily because we’re simultaneously trying to crack a technical problem and figure out what, in the end, makes human intelligence human.
SXSW panel preview: The City Is A Platform

Time for the annual pilgrimage to Austin for South By Southwest. I’ve been on panels before but, with zero disrespect to previous co-panelists, the one I have currently lined up is going to be really freaking good, maybe the best ever. Here are the details.
Tuesday, March 16
11:00 AM
Room 9ABC
Austin Convention Center
[Add to my.sxsw.com or sitby.us.]
The panel is a great cross-section of perspectives on networked urbanism. We’ve got non-profit, academia, start-up, city government, and faceless mega-corporation (me).
Ben Berkowitz runs SeeClickFix.com, a tool that allows communities to report non-emergency issues to those responsible for the public space. This app has changed the conversation around civic engagement and prompted a number of municipalities to rethink their 311 strategy. Also, NPR likes it.
Assaf Biderman is the Assistant Director of the SENSEable City Lab at MIT. The work from the lab itself is amazing (flying LED robots, trash-tracking, city bikes that are also environmental sensors!), but it also approaches art, having been featured at the Venice Biennale, Centre Pompidou, and Ars Electronica. Also, he’s the suavest panel member.
Dustin Haisler is the CIO and Administrative Judge for the City of Manor, Texas. Words can’t do justice to the amazingness that is Dustin. But a link might. He’s just completely rewritten the rules of city governance and engagement. Also, he’s younger than you.
Jen Masengarb is an Education Specialist at the Chicago Architecture Foundation where she educates the public about cities and the built environment. Jen gets what it takes to translate the urban world for its citizens and is a template for how we might do so in our second cities of data. Also, she’s the femalest member of the panel.
And then, me, of course. I’m just stewarding the awesome above.
We’re going to tackle three question areas, broadly.
- What is the physically-built urban environment’s relationship to the digital environment that is being built atop it? Put another way, is there a mandate for information architects to be thinking as critically about cities as they do about websites?
- What is the design imperative: how do we train the makers of today to think about the city as a platform?
- What is the role of citizens in this design? This is different than focus groups and user studies. Citizens shape the machine that is the city in completely indirect and informal ways.
If you’ll be in Austin for South by Southwest — and you’re hanging around until the last day of Interactive — I’ll bet you a taco and a beer you’ll learn something from this panel.
Recap post and podcast to follow.
Cooking as ancestor worship
My wife comes from a long line of exemplary cooks. She works the kitchen by instinct, mixing, matching, improvising. She’s economical, mindful of but not enslaved to kids’ eating schedules, and treats recipes as inspiration rather than prescription. When life gives her lemons she makes lemon meringue tart.
This in no way describes my approach to cooking.
For one, I have no sense of proportion or timing. When I get it in my head that I am going to cook I can rarely tolerate not cooking — from scratch — every last damn thing. Call it a sense of cheating. If it can be made rather than poured from a can, I want to make it. It’s such a problem that there’s even a mild irritation that I can’t actually provision the milk or beef or rare vegetables from my backyard.
Which of course means that dinner is rarely served before 10 PM on the nights I cook.
Normally this little mania takes the form of Italian cooking, specifically Southern Italian, usually from the region of Basilicata. Lots of reasons for this, mostly having to do with family heritage (copiously covered previously).
Last week, we made ravioli, with a twist. The particular recipe comes from my great-grandparents’ hometown of Barile, a village long-steeped in Albanian tradition. Ravioli all’albanese has been described as “dessert and dinner all in one” because the ricotta filling, called gyuz, is sweetened with sugar and cinnamon. Full ingredients and recipe.
The ricotta was fun and surprisingly easy. One gallon of whole milk plus one quart of buttermilk, heated to 175°F until the curds start to separate. You then ladle the curds into cheesecloth and drain. Add your chosen seasoning and the fluffy warm filling is ready to go.
Hand-making ravioli, on the other hand, was an extraordinarily laborious undertaking. We’d made pasta from scratch before — with an electric machine — but that won’t do for the sheets that form the ravioli pillows.
So we borrowed a friend’s hand-crank pasta machine. Problem was, it had no clamp to secure it to the counter which, if you’ve ever tried forcing dough through a tiny metal slit, was no fun at all.
Well, that’s not entirely true. Getting it right was immensely satisfying.
Once you have the sheets you use this fabulous little slicing/pinching wheel specifically for ravioli. This gives you the pillow “casing” into which you put the ricotta. You have to make sure the edge seals firmly as you will shortly be plopping the ravioli in boiling water and don’t want filling exploding everywhere.

The recipe calls for meatballs and tomato sauce as accompaniment and here is where the from-scratch obsession shows its ugly underside. These sides ended up being two separate meals entirely. For one, the fist-sized meatballs came from a Neapolitan recipe that includes grated Parmesan, garlic, basil, oregano, and nutmeg (vetoed by wife).
The sauce, however, wasn’t really a sauce but a ragù, basically an entire meal in a pot.
You pound pork shoulder flat, line it with pancetta, then fill it with a yummy payload of garlic, parsley, chili powder (hallmark of this region), nutmeg (vetoed), and pecorino or parmesan. Add white wine and canned whole tomatoes and simmer forever.
In a nutshell what you get after simmering this pork bomb is a sauce for pasta and a second meal, which we didn’t even attempt to eat on the night in question.
All in all, a fantastic experience, though perhaps not one best-suited for a weeknight. Let me know if you’d like detail on the ingredients or process.
Full photo set here.
See also Spaghetti All’assassino and Lucanian risotto.
Off-world, a party turns 10
Ten years ago my wife and I had just moved to Chicago. Kidless, dual income, cool new top floor condo. We threw a Christmas party for the few people we knew. It was fairly low-key: appetizers, beer and wine only, and holiday tunes softly played. The invitation even had an end time on it.
It isn’t like that anymore.
The get-together has become something of a spectacle, an entire year’s worth of creative energy throttled into a single night that reminds us of a youth I don’t think my wife or I ever actually had. And it’s kidless once more, having evolved into a house-sized version of stays-in-Vegas that the children would surely be embarrassed by later in life if they had the memories. (And will, thanks to this post, hundreds of photos, a full video feed and the Google bots. Sorry, kids.)

The parties early on never had themes, but eventually we started giving away favors and that led to light theming, usually holiday-related (e.g., “I Think They Spiked the Nog” and “Lords a-Leaping”.) But themes are a gateway drug and soon enough we were in full-blown obsessive-compulsion about every last detail conforming to the chosen motif.
Last year, the theme was “Around the World,” celebrating travel of all kinds and lending itself handily to silly tie-ins. This year’s theme — Out Of This World — seems almost predetermined given the re-use it made possible of certain globe decor from last year, but also because of what a space nerd I am. (And yes, it lends itself to a world “trilogy”, more on which later.)

The favor proved challenging as we had designed ourselves into a bit of a corner last year by dumping CDs in favor of USB keys. The consensus opinion (meaning my wife’s) was that people really didn’t use the key drives — leading me to question our choice of friends, frankly — and the decision was to go back to CDs.
This led to what I thought was a fantastic idea. I’d build an armillary sphere with the compact disc as the celestial equator! Wait, come back. If I admit that it would have taken months and every shred of sanity I have to actually make them, you have to admit it would have looked amazing.
Next idea: ringed planet. It was a contentious design process, honestly, but in the end it yielded something great. The CD (two actually) formed the rings, a styrofoam ball sliced in half and glittered formed the planet. This sat like a garnish on a mini-martini glass which itself was set atop a coaster that was our holiday card (photo of kids with greeting). Initially Robyn suggested the card be a flag planted atop the planet. Which of course is silly, given that Saturn is a gas giant and you can’t plant flags on it. Sheesh! (This kind of thinking led to a chandelier planet arrangement that was far from accurate.) Our fantastic nanny, Ellen Gallerini, and her business partner — the Glitter Girlz — bore the brunt of the assembly work. Amazing, huh?

But the real stroke of genius came from Robyn: the glasses were filled with Mentos and the entire favor display was backed with 2-liter bottles of Diet Coke. Blastoff! (If you’re unaware of the particular physics involved here, have a look.) Not sure if anyone tried this, but in keeping with our tradition of home-wrecking favors we have reports that the glitter got into and all over virtually everything it touched. I can’t imagine the discs were actually playable. (Which is OK: you can download it here.)
Food and drink stayed on-theme, my particular favorite being the red velvet frosted cake balls peddled as moon rocks. The custom drink list, bane of our hired bartenders and the ultimate scapegoat for much that happens, was equally tasty. Choice selections included the Tang-tini (Tang and blood orange martini), Fly Me To the Moon (Passion Fruit Vodka and Prosecco), and the Black Hole (Espresso Martini). Bottleable quantities of each of these drinks were sucked from our carpet by Stanley Steemer a few days after the party.

A note on the bartender. Serving drinks for this party is pure misery. In an effort to encourage a flow through the house, we put the mixed drinks and bartender in the basement. This meant he was subjected to at least 7 hours of aural and visual assault in a very limited space. Add drunk revelers and dancing bodies. Stir.
Well, we’ve solved this problem and his name is Matt Vogel, aka “Fingers”. We didn’t know the reason for this nickname until he showed up. Fingers, you see, has only one hand. Fingers insisted we call him such and I protested until he produced a business card with “Fingers” on it. You can imagine our thoughts when a one-handed guy showed up for what is a tough assignment for a barkeep with four arms. But here’s the thing: Fingers was amazing. He kept pace, didn’t complain, and stayed late — all with a great disposition.
The theme is fun. The food, drink and decor are festive. But the genetic mutation that’s most responsible for the party’s evolution is what happens in the basement. To quote a friend, “I don’t even mess with the first floor anymore.” Let’s go there.

Basically the lower level is just one big media generation machine. “Photobooth”, live video feed, lots of roving photo/video cameras, a closed-circuit feed to two projectors, two iSights snapping at regular intervals, and a recording of the audio from the DJ booth ensure that it is well-covered. Good thing too; there are a lot of cute boots down there.
It’s a massive effort. We move every last shred of furniture and decor out of what is a very functional and much-used basement (our family life routine is also effectively moved out), then load in a forklift’s worth of plywood to construct what becomes the Nightclub on Henderson Street.
We amped up the lighting this year, figuratively and literally, adding three high-powered spots, stage washes, and a physical control panel to the full roster of DJ spots, LED cans, strobes, projectors, and laser. This is all due to a guy who wasn’t actually at the party. Tom Herlihy, visuals expert and total lighting nerd, loaned all the equipment, trained a totally capable assistant, Chris Gansen, and then decamped for Kabul, Afghanistan for work. And this was the reason for the live video feed. Tom caught parts of the party in the Dubai and London airports. Totally worth it.
The DJ booth is simply a beast. Originally constructed to accommodate two people, enlarged to fit four, and then, this year, completely rebuilt. The 2009 version situated the three DJs more comfortably while giving the AV control a kind of crow’s nest above it all and, importantly, providing a dance platform behind the DJ surface, since that’s where we found people pooled anyway.
Clearly raised areas attracted people in past years, so we built two dance platforms out in the crowd. These were sturdy and festooned with instructions that we figured even the drunkest partygoers would understand.

The DJ setup this year exceeded all past efforts. The unbelievable Jesse Kriss returned (this time from Seattle rather than Boston) and provided the real turntable chops. He was the master of ceremonies for all audio, messing with whatever Joey and I were pumping out via Ableton and Traktor. We also had a Korg KAOSS pad (a tactile/visual effects and loop controller) which we were totally smitten with mere seconds after hooking it up.

We played for over eight hours, covering a serious range of tunes. Jesse, Joey, and I really seemed to click this year, handing off more smoothly than catastrophically most of the time. (I stress most of the time. See custom drink menu, above.) The floor was packed with dancers for hours. The apotheosis of the party, truly.
Jesse’s fantastic beginning set is excerpted here with full tracklisting.
The built-in downfall of the party, it seems to my wife and me, is the ever-more-difficult challenge of making the spectacle that much bigger year-on-year. But that’s a problem for the future; we hit the mark this year. Inspired by Daft Punk inspired by Tron we constructed three glowing jackets of electroluminescent wire for the DJ crew. The nerdfest began about four hours in and was met with a solid wall of cheering.

The jackets were a bit of a pain in the ass, as we had to affix the somewhat delicate EL wire with tiny safety pins from inside the jacket. But my god was it worth it. Everyone wanted to wear them, which was fine by us as they were hot as hell. Biggest upside: wearing a jacket of copper wire with electricity coursing through it was an effective deterrent to me taking my shirt off, something that has regrettably become a de facto tradition at the party. Not this year!
Though there’s no end time on the invitation anymore, there’s something about this party that demands a discrete finale rather than fading out with hangers-on. Last year this finale came courtesy of the Chicago Police Department. We escaped that this year, somehow. (How we weren’t charged with “operating a public place of amusement without a license” is beyond me.)
This year the ending came via a small explosion.
Piecing together exactly what happened was a massive chore taking weeks and all kinds of CSI-style cross-referencing of testimonial and media timestamps. The folks still there at 3:45 AM said later on that the power cut out. Apparently I rushed to the circuit breaker in the back bedroom to check this and in the process intruded on two sleeping guests who had called it quits.
But that wasn’t it. Couldn’t have been. The recording of the night proved that power remained as it continued for hours uninterrupted after the music ended. We were pulling from four separate circuits in the basement, having learned our lesson from the strobes in previous years.
The next morning the only thing people recalled was me saying “Party’s over. Get the hell out!” But my laptop was completely dead. Dead, but seemingly unmolested. No drink spills apparent anywhere. This is not, however, what Apple repair ultimately said. “Extensive internal liquid damage” was the diagnosis. As best we can tell, liquid seeped in through the Superdrive bay slot on the right side of the laptop and then destroyed everything but the hard drive and wireless radios. No idea how that could have happened.
And that’s how the party ended.

But how it came to be is more important. Dozens of people gave dozens of hours to realize such a thing. We’ve mentioned Tom Herlihy, Jesse Kriss, Chris Gansen and my brother Michael, but that leaves out Justin Bowersock, Alyson Higgins, Cathy Brennan, Heidi and Pat Potter, David Balcom, Mike Bloebaum, Ricky Thorpe, Michelle Simpson, Tom Alter, Ellen Gallerini, Jodie Deschler and others who absolutely made it happen. I’ve said it before and I don’t give a damn if I say it again. This is your party too. THANK YOU.
The experience is extraordinary, for sure, but so is the toll it takes on the family to bring about. The half-jokes Robyn and I made about this being our last party during the run-up became less than half as the party approached. But we recognize that we can’t just end something like this without warning. Too many people have too good a time to do that.
So I’ll ask you, dear reader, if you’ve been around the world and off the world, what’s the only thing left to do to the world?
——–
Here’s the full photo gallery. See also Chris Gansen’s great shots.
Curious about past parties?
Read about 2008 (photos!), 2007 (photos!) and 2006.
Or listen in: mixes from 2008, 2007, 2006 and 2005.
Our second city
Recently I was asked by WBEZ, the Chicago NPR affiliate, to write an essay on a topic or trend from 2009 that I would like to see carried forward “from here on out”.
What I wrote was a condensation of a year of conferences and talks informed by IBM’s Smarter Cities perspective — all with a Chicago bent. It was an interesting and ultimately enjoyable exercise, whittling down a tough subject into something to be read aloud. I’m grateful to NPR for the opportunity and their collaborative editing.
Here’s the link to the transcript and audio on NPR. The actual broadcast, I’m told, will be during All Things Considered on 1/1/2010. Pretty sure the broadcast is Chicago-only.
Here is the original essay, which gives a little more context to my screed.

This past year offered Chicagoans some unique opportunities to consider our collective identity as a city. We looked forward, dreaming of how we might remake the urban space to host the world and its Olympians in 2016. We looked backward, celebrating Burnham’s 100-year-old vision for what the city might become and, perhaps more interestingly, what it never did become. These two events both asked Chicagoans to imagine a city that did not exist, to grapple with a series of what-ifs about the built environment.
And yet, there’s another city — equally intangible — being built even as we move on from the Olympic decision and unrealized bold plans. It is a literal second city, built right atop our architecture of buildings, streets, and sewers. This is the city of data — every bit as complex and vital as our physical infrastructure, but as seemingly unreal (and unrealized) as the what-might-have-beens of Burnham’s Plan and Chicago 2016.
But what is a city of data and why should Chicago care about being one?
IT research firm Gartner notes that by the end of 2012, 20% of the (non-video) data on the Internet will originate not from humans but from sensors in the environment. If your eyes just glazed over, let’s look at this from a different angle: if Gartner is right, for every four text messages that a pedestrian sends, the sidewalk she is walking on is sending a message of its own. The city itself is becoming part of the Internet.
This is happening already. The city is increasingly instrumented; nearly everything today can be monitored, measured, and sensed. There are billions of processors embedded in everything from structural girders to running shoes. Millions of radio frequency identification tags turn inanimate objects into addressable resources. The city is immersed in an environment of data continuously built and rebuilt from the lived experiences of its occupants. And yet, this information architecture is hardly planned, much less dreamt about, or celebrated.
Consider the intersection of Michigan Avenue and Congress Parkway, what Burnham envisioned as a grand pedestrian-friendly concourse leading westward towards a towering civic center and eastward to the lakefront. This was never built, of course. (The circle interchange is our civic center, alas.) And yet there’s another built world, equally intangible, an infrastructure of data, overlaid on this intersection.
- Three students surf the web thanks to an open WiFi cloud that leaks out of a local hotel lobby.
- Several GPS units in cars all update with detail about the intersection as they approach.
- Sensors embedded in the water main below the street register a blockage.
- Closed-circuit cameras in three different shops capture the same window shopper as he moves down Michigan towards Randolph.
- An exhausted cyclist’s bike computer uploads his location and energy expenditure as he stops to use his iPhone to log into a Zipcar waiting to take him home.
- The city 311 database is populated with 7 different service requests from the surrounding area, coming from phone and e-mail.
- Taxis criss-cross the intersection as their fare data trails are logged locally and broadcast to dispatch.
- Four different people tweet from different perspectives on the same news crawl that moves across a building’s frontage.
- A bus stops to pick up passengers and bathes them in the glow of the full-color video screen running along its side.
- RFID chips on pallets loaded into building docks beneath the street respond to transducers in the receivers’ doorways.
And on and on. The examples are commonplace, but together they form an infrastructure — or superstructure — a second set of interactions, invisible or barely visible, atop the interactions that we plan for and currently build for. Proprietary, public, local, remote — all manner of data continuously permeates the streetscape. And yet we scarcely think of how it plays a part in the city that we’re building, the city that we want to become.
We don’t dwell on physical city infrastructure much either — unless we’re momentarily captivated by an architectural facade or, more commonly, inconvenienced by some lapse in the expected service. And yet. We’re the city that defines architectural styles for the world, that elevates an urban planner to local celebrity, that engages in a heated debate about the merits of remaking ourselves for the Olympics. From here on out why should we not apply such passion to the next wave of digital infrastructure? It is a decision not to be made lightly or as a thought exercise: how we design our city of information is as vital to quality of life as streets, schools, and job opportunity.
Dan Hill, a leading urban designer in matters digital, notes that we often think of the information landscape like street furniture and road signs, as adornment or a supplement to the physical environment. But fissures in a city’s data infrastructure are as consequential as potholes. They are structural failings of a city at the most basic level, in a way that a busted piece of street art would never be.
Think of cell phone outages — “dark zones” — as potholes in the urban information landscape. Or consider GPS brownouts, like those that cause bus-tracking errors when CTA buses enter the satellite-blocking skyscraper canyon of the Loop. But these examples are minor compared to the real issue before us: how do we proactively build a city of information that is inclusionary, robust but flexible, and reflective of a city’s unique character?
Our built structures — physical and digital — are manifestations of the patterns of human life in a city. They encode our desires, our needs, and our hopes. In some cases the permanence of the built environment inhibits or works at cross-purposes to these goals. (Think of expressways as barriers to the way people move about neighborhoods.)
We have a unique opportunity to ensure that our digital infrastructure avoids the mistakes of our physical infrastructure, to make Chicago the envy not just of building architects but of information architects.
I suggest two ways to start. First, to engage in a dialogue about this new built environment (as we did collectively this summer), our city planners and citizenry need to be at least as conversant with the language of information architecture as we are, at a basic level, with that of physical architecture. Call it an aesthetics of data. This is a matter of becoming aware of what’s happening around us, of figuring out the most elegant ways of making the unseen felt, of thinking of our urban spaces the way I described the interactions at Michigan and Congress.
Second, we need to recognize that, while the power of information is the power to connect, every linkage made represents a connection not made or, at worst, a disconnection. (Think again of the unintended effects of expressways on neighborhood mobility.) Our plan for a networked urbanism should seek above all to be maximally enfranchising, lowering barriers to commerce and community.
We must take up this mantle and be active participants in the design of this networked urbanism. We must make our voice heard. From educating our elected representatives about the opportunities before us, to encouraging our youth — who increasingly live in a world of data — to think critically about their role in the urban fabric, we must embrace this challenge with the same passion embodied in our historical tradition of remarkable plans for Chicago.
[This essay is cross-posted at the Building a Smarter Planet blog.]
‘Bout damn time
It’s out. Jesse’s posted his 2009 Inauguration Mix.
Superb track curation, tasty scratches, all mixed live like a guy who knows what the hell he’s doing.

My contribution? I am a spectral fraction of the crowd-cheering waveform from the Grant Park speech layered in. So, yeah, I’ll be demanding royalties.
Pull it down and turn it up.