Author Archives: Javaun

About Javaun

I’m 32 and married to a wonderful woman; my life is about meeting new friends and recreating outdoors. Once they put a screw in my ankle, I’ll be back to hiking and trail running and mountain biking.

NPR API Course Now Live on Codecademy

On Wednesday, Codecademy launched a full track of lessons on web APIs, and NPR was a launch partner with a course on the NPR API.  There was a bit of media coverage.

This entailed weeks of nights and weekend work but it was a ton of fun. For me, it started last November when I met Codecademy’s Sasha Laundy at DC Week. When she said they were about to pilot a track on APIs, I jumped at the chance.

I’ve been a big fan of what Codecademy is trying to do since it launched. The world has a dire shortage of geeks, and they’re trying to make more. I’ve heard a few people scoff that you can’t learn to code just by spending 30 minutes here and there on Codecademy. To be clear — you can become a good programmer in *many* ways as long as you’re willing to put in the time and struggle through difficult problems.

But even my argument misses the main point. By removing all of the friction in working with a new technology, Codecademy is betting that a lot of folks who would never have tried programming in the first place will give it a shot. And some of them will fall in love and might even major in computer science.

I have similar aspirations for the NPR API course on Codecademy. Perhaps the next generation of public radio listeners — those who don’t have radios and aren’t yet listening — will fall in love with this quirky open API that allows one to dabble with world-class content in the public interest. It’s important for us to be open and it’s important for us to be out there. Public radio has enough inspiration to share; we need the next generation of coders to help us realize our mission.
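If you’d rather poke at the API directly before (or after) taking the course, a first call is only a few lines. Here’s a rough Python sketch; the endpoint, parameter names, and response shape are how I remember the Story API working, so double-check them against the official docs, and the key and topic ID are placeholders.

```python
# A minimal sketch of a Story API call in Python. The endpoint, parameter
# names, and response shape are from memory, so check them against the NPR
# API docs; the key and topic ID are placeholders.
import requests

resp = requests.get(
    "http://api.npr.org/query",
    params={
        "apiKey": "YOUR_NPR_API_KEY",  # free key from the NPR API site
        "id": 1001,                    # assumed topic ID for recent stories
        "format": "json",              # JSON instead of the default NPRML
        "numResults": 5,
    },
)
resp.raise_for_status()

for story in resp.json()["list"]["story"]:
    print(story["title"]["$text"])
```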

How News Foo changed me

This past weekend I was lucky enough to return to News Foo 2012. Elise, Derek*, Molly, Adriano, and Greg have all given excellent recaps, so I won’t try to duplicate those. Instead, I’ll talk about how I changed after attending News Foo 2011 and what I’ve learned since then.

First, News Foo is an event unlike any other I’ve ever attended. There have been a few experiences in my life that stand out for altering my outlook on what is achievable or for teaching me courage or grace. Bike racing, parenting, and marriage have each yielded a few of these and triggered irreversible personal growth. News Foo was the only thing remotely related to my media career that’s yielded that kind of perspective.

News Foo is a sort of intellectual Burning Man for those who are passionate about news and technology. It’s an intense experience lived over a very short period of time, where you will be exposed to more ideas and more incredible people than you can possibly absorb in a few years, let alone two days. It’s a hand-picked, curated crowd chosen to send each other sky-high in the hopes that we’ll return home to tackle big problems with renewed vigor, a new perspective, and each other’s help.

While I typically attend public unconferences, the private invite-only setting and FrieNDA ethic of News Foo offers a level of intimacy and openness I’ve never seen anywhere else. Campers range from up-and-comers to outright luminaries in fields that include media, technology, science, art, business, and academia. They’re also precisely the kind of individuals who want to approach old intractable problems with new optimism and creativity. John Bracken only half-jokingly quipped: “If this room exploded, the Internet would be set back 10 years.” For most attendees, it is impossible to leave Phoenix uninspired. **

My first News Foo was an amazing blur. I did not heed the organizers’ advice and arrived at News Foo 2011 poorly rested. I was both overwhelmed and star-struck to be among so many people I’d admired from afar for so long. I gave an Ignite talk on reinventing audio. I participated in sessions ranging from remaking newsrooms to “Let’s invent the worst startup imaginable.” (One of the finalists was “Chew’d”, a food-truck franchise that serves — wait for it — pre-chewed food.) We got off campus for a hike and a trip to the botanical garden. I played an obscene amount of Werewolf that culminated in a total mind-screw.

I came to News Foo 2012 much more relaxed. I actually slept the week prior and did not commit to an Ignite this year (i.e. I was able to drink beers and relax). Sean Bonner, Nadav Aharony, John Keefe, Alex Howard, and I led a Saturday morning panel on sensors. A Saturday session on “Engineering Serendipity” got meta when session leader Ethan Zuckerman remarked that the News Foo attendee list was itself curated in favor of serendipity. Brian Fitzpatrick added that some individuals would never choose to attend such an event, further curating the experience. I took Sara’s advice and went to more panels that sounded offbeat and interesting. I commiserated with fellow type-A’s trying to unplug and take a vacation and also shared “FAIL” stories with some incredible folks. We took a walk around Phoenix. A brief conversation with Jenny Lee and David Cohn turned into my personal self-help session. Even at News Foo, the best conversations often happen outside of sessions, and yes – at Werewolf. So of course, I played an obscene amount of Werewolf that culminated in a total mind-screw.

And what about the time in-between my first and second News Foos? One of my takeaways from News Foo 2011 was that I needed to share more. I did that this year both person-to-person and at events. While I survived my first Ignite talk, I found it to be such a valuable exercise that I decided to do more. I gave a talk on sensors for news at the TechRaking conference at Google and another humorous one on fashion, news and cognitive bias. It takes me upwards of 30 hours to prepare a five minute Ignite-style talk (generically a “lightning talk”), but the time and format constraints force you to be concise and clear while delivering a strong narrative arc. It’s something that’s greatly helped me communicate complicated ideas to diverse audiences. Those are skills I need to perfect to create the kind of impact I expect.

The inspiration is certainly powerful, but the most enduring gift of News Foo is the network. I’m not exactly a shrinking violet, and I have no shortage of people in my personal network. But this year, I chose to make Foo friends my primary network, and that has had a terrific impact on every action taken and every decision I’ve made since.

This year I’ve introduced campers to friends and professional acquaintances, helped them seal a few business deals, and called on them for advice. If you were in town for a few hours, I figured out how to get away from the office and grab coffee. Just yesterday I had lunch with Dan Oshinsky before he moves to NYC to join BuzzFeed. The Chicago Air Quality Egg Hackshop wouldn’t have happened last summer if I hadn’t met Fitz at News Foo, become friends, attended his ORDcamp, and then introduced his incredible Chicago community to Joe and Ed, the NYC-based leaders of the Egg project. Networks matter. If I met you at a camp, I immediately advanced you to “old friend” status. And why wouldn’t I, since we just endured a marathon of bliss together and we’re all so passionate about solving the same difficult problems? Whatever I put into this community, I get so much more in return.

News media is in the business of solving big societal information problems. We’re strapped in every way imaginable, and to be able to tap the opinions of exceptional friends has changed the way I plan long-term and the way I work day-to-day.

Was it hyperbole last year when I called News Foo “life-changing?” No, I think I just explained how I have indeed changed and why I’m better for it. There are so many wonderful people I’ve connected with this year, but I want to specifically thank Sara Winge, Richard Gingras, John Bracken, Tim O’Reilly, and Jenny 8 Lee for organizing the event.

It’s been a few days since I returned from News Foo 2012. I’m recovering and still supercharged, though I realize that the inevitable post-Foo withdrawal will soon arrive. I’ve been fortunate enough to attend two years in a row, which likely means it won’t be my turn again for some time. I’ll still be chasing crazy ideas and still be on the network ready to motivate and assist. Look for me, and don’t be shy about reaching out.

* Derek’s piece was excellent, and my only potential point of disagreement is that this perceived lack of ideological diversity has at its roots some very partisan stances that have no place at any open event that looks to promote understanding and tolerance. News Foo is not an inherently partisan event, and the majority of journalist attendees exemplify the non-partisan stances of their profession. The majority of non-journalism attendees show the same disdain for politics that most of the country does. That said, News Foo and its attendees embrace ethnic diversity, advancement of science, gender equality, and sexual orientation equality. These are non-partisan values that unfortunately have been politicized and rejected by one of the major political parties. The only esteemed News Foo value where both parties share an equally abysmal record is the Open Internet. I look forward to a time when the agreed-upon starting point is that diversity, science, and equality are good things, and then we disagree on how to get there most quickly.

** Both in 2011 and 2012 I met a few campers who don’t see a path forward and were probably invited because they are in a position of responsibility and everyone hopes their internal switches will flip. It’s the job of everyone else to inspire them and convince them to shake things up. Unfortunately, it doesn’t always work.

A chat on NPR APIs, mobile apps, connected cars, and digital content

I was recently interviewed by Lee Dumond (@leedumond), Nick Berardi (@nberardi), and Dustin Davis (@prgrmrsunlmtd) for the latest Mashthis.io podcast. It was an interesting conversation about content syndication, app development, and using APIs to supercharge a content and platform strategy. 48 minutes is a big commitment though, so I timecoded our discussion topics and added a few links for context. My favorites are bolded.

You can  listen here.  The audio is a bit choppy because of Skype difficulties. If you hear the same thing mentioned more than once, it’s probably one of our retakes.

Update 10/23/12

Nick Berardi asked a really important question on document databases vs. relational databases, and I didn’t answer it as well as his question deserved.

Nick noted that organizations have traditionally standardized on one relational database, while the document DB trend is to “pick the best one for the job.” This is a hugely important insight, and there are two intertwined reasons: the intrinsic nature of the tools themselves, and the shift away from traditional client-server architecture to SOA/API-based architecture.

Document databases have so much variability between one another that you really have to pick the best tool based on the capabilities you need vs. the tradeoffs you’re willing to make. They differ by query capability, ACID compliance, clustering, retrieval speed, etc. On the other hand, traditional SQL databases might have 95% overlap, and you’d think that would make it easier to switch between them, but that last 5% has been relied upon so heavily that it made lock-in the logical choice. Proprietary SQL syntax, stored procedures, or, in the case of SQL Server, DTS have traditionally been important features for querying and transforming data, and if you wanted to take advantage of them, you needed homogeneous architecture.

That’s a good segue to the transition to data services — in our case, RESTful APIs. With standard interfaces you can untether yourself from uniform infrastructure. Your data services may not actually run as fast as if you were syncing databases with proprietary services, but you’ve given yourself more flexibility to pick the best tool and move quickly. Every shop has its favorite tools, but we’ve already used both CouchDB and Redis on small internal services and can see ourselves picking up another document DB here and there as needed.
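To make that concrete, here’s a toy sketch of the kind of thin abstraction that keeps the “best tool for the job” choice cheap. It’s in Python rather than our PHP stack, it’s not NPR’s actual code, and the hostnames, database names, and class names are made up; the point is just that the service layer only ever sees get/put, so the backing store can change without the API changing.

```python
# Toy sketch, not NPR's actual code: two document stores behind one tiny
# get/put interface, so the service layer above doesn't care which store is
# underneath. Hostnames, database names, and class names are made up.
import json

import redis      # pip install redis
import requests   # CouchDB speaks plain HTTP, so requests is enough


class RedisStore:
    """A good fit for small, hot, key-addressable documents."""

    def __init__(self, host="localhost", port=6379):
        self.client = redis.StrictRedis(host=host, port=port)

    def get(self, doc_id):
        raw = self.client.get(doc_id)
        return json.loads(raw) if raw else None

    def put(self, doc_id, doc):
        self.client.set(doc_id, json.dumps(doc))


class CouchStore:
    """A good fit for larger documents you may also want to query by view."""

    def __init__(self, base_url="http://localhost:5984/stories"):
        self.base_url = base_url

    def get(self, doc_id):
        resp = requests.get(f"{self.base_url}/{doc_id}")
        return resp.json() if resp.ok else None

    def put(self, doc_id, doc):
        # Note: creating a new doc only; updates would also need CouchDB's _rev.
        requests.put(f"{self.base_url}/{doc_id}", json=doc)


# The RESTful endpoint only ever sees get/put, so swapping the backing store
# is a one-line change -- the flexibility described above.
store = RedisStore()
store.put("story:1234", {"title": "Hello", "body": "..."})
print(store.get("story:1234")["title"])
```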

Beyond flexibility, there’s a lot to be said about developer happiness and productivity. Both come from exploring new tools and picking the right tool for the job. That’s worth a lot.

Mashthis.io: The NPR API with Javaun Moradi

2:00 – Is NPR a truly digital system or a radio system with a digital presence?

  • My spiel on public radio vs. NPR.

4:14 – Getting 100+ local stations into the API

7:20 – What kind of APIs are under the hood in a mobile app?

9:50 – The challenges of streaming audio to mobile apps

13:00 – How has the API changed over the years?

  • Changed the way we write REST APIs
  • Moving at the speed of business and avoiding becoming big IT
  • ZaPHPa – a micro framework written by Irakli Nadareishvili and other members of our team
  • Story API permissions

16:40 – How do you decide when to write a new API?

19:40 – How do you keep up with documentation?

20:55 – Technology: hosting and programming languages

23:00 – Would we stay on PHP? How do we get more API speed?

25:50 – What is this Drupal project?

29:00 – ElasticSearch + Document database.

  • We had a *lot* of retakes around document databases.
  • Besides speed of retrieval and flexibility, JSON is increasingly important to us.

31:15 – More on NoSQL databases

  • I forgot to mention we’ve also used Redis.

32:42 – The future of connected cars. Will there be a standard among manufacturers?

  • Will it be driven by smartphone platforms like iOS and Android, or by the manufacturers?

Lessons from the Chicago #AirQualityEgg hackathon

(This was originally published as #AirQualityEgg in Chicago… The First Deployment as a guest post on the Cosm.com blog, where it appeared with some additional edits by Ed Borden.)

On July 24 and 25, Air Quality Egg hackers from around the country gathered in Chicago for a quick sprint. Our objectives were to substantively advance the overall Air Quality Egg project while also developing a Chicago Egg community. Ed Borden, evangelist at Cosm.com and AQE project lead, had set an extremely ambitious agenda: solder 15 Nanode boards using the newest sensor configuration; modify, load, and test the on-board software; fabricate Egg enclosures; deploy sensors in the city of Chicago; and collect and attempt to visualize our data. It was a week’s worth of work and we had two days.

Community interest is tremendous and growing

Though we announced the Chicago Hackshop barely three weeks prior to the event date, we had enormous interest with many out-of-town attendees. Joe Saavedra and Ed Borden brought a large contingent from New York. David Holstius came from Berkeley, David Hsu from Penn. Richard Beckwith and Adam Laskowitz traveled from Portland. Matt Waite flew in from Lincoln, Nebraska. I flew in from D.C.

But interest from the City of Big Shoulders was the biggest reason we had such a successful event. The School of the Art Institute of Chicago donated its world-class facilities and equipment, which included a general workspace, surface-mount soldering equipment, and the laser cutters and materials used in the enclosure development. Argonne National Lab helped promote the event within its network and sent several attendees.

The City of Chicago was very excited about the project, and though we’d only given them a few days’ notice, they offered to help us host Eggs on city property and scrambled to find suitable locations (more on that in a second…). Employees of the U.S. EPA’s Chicago office also dropped by, both out of general curiosity and to answer questions about the EPA’s own monitoring sensors and air quality data. Our group was diverse and cross-disciplinary.

Besides attracting folks from Chicago and beyond, we had old and young, experienced Egg hackers and first-timers, and backgrounds that spanned environmental science, architecture, design, journalism, the humanities, engineering, and computer science, to name a few.


Deployment: people first

The physical requirements for the boards we used were wired Ethernet and power. We wanted to deploy near areas that were well-trafficked by people, with sensors placed at the height at which people congregate. Ideally, we wanted to be able to cluster many sensors close to each other and close to a calibrated EPA sensor to enable comparisons of Eggs vs. other Eggs and to a calibrated scientific-grade sensor.

Ed and Joe shared the people-first adoption approach that they’ve arrived at after months of experience. The Egg community is meant to extend far beyond those who attend hackathons. We need people who can adopt an Egg, deploy it at their home or work, maintain it over time, and share and promote the project over the long term.

Though we were very excited at the City’s gracious offer to host Eggs on city property in prime locations, given all of the uncertainty and new updates, what we needed most was committed volunteers to oversee each Egg. We stuck with the “people first, location second” approach. It has already proved to be a wise choice. We’ve had a few technical hiccups since the event, but the Eggs’ dedicated foster parents have made small adjustments and updated firmware.

Enclosure Fabrication

While the machine-fabricated Eggs for the Kickstarter campaign will use an injection-molded case, Ed lamented that this would be the one part of the entire unit that DIY enthusiasts couldn’t build on their own. The team needed a case for the Chicago sensors, and this was the first crack at an open case design that could be downloaded and cut from simple materials or pre-cut and shipped as a kit for self-assembly.

The group wanted something beautiful and functional. The case needed adequate venting for the microcontroller, maximum airflow for the air sensors, and adequate resistance to precipitation. When the team saw the first completed case, our jaws dropped. The enclosure group came up with something that was both functional and gorgeous, right down to the “www.airqualityegg.com” etching. A few tweaks will be needed to increase water resistance and make it less likely to trap heat under direct sunlight, but it was an incredible endeavor for two days and something to build upon in the future. Like all components of the Egg project, the design files are open source.


Hardware

Miraculously, Joe cleared TSA security with a suitcase full of Nanode boards, sensors, and other assorted hardware. The Chicago sensors were built using the latest sensor configuration and contained on-board temperature and humidity sensors, as well as sensors for NO2 and CO. The bulk of the soldering was done by two high school students!


Joe’s team made software updates to adjust the voltage based on temperature and humidity, giving the boards a way to compensate for climate conditions.

Data

While the rest of the team focused on building and deploying hardware, the data team had two full days to think about their task. First, we needed a yardstick by which to compare our Egg sensors. The closest official sensor was the Illinois EPA air sensor at 327 South Franklin Street in the Chicago Loop. We worked with Tim Dye of SonomaTech, the company that manages air quality feeds for state EPAs and the U.S. EPA. Tim gave us access to the most granular hourly EPA data available, and we ingested it into Cosm.com for ease of use. We started with a few questions we wanted to answer:

  • How does Egg data compare to the EPA data?
  • How can we compare Eggs for consistency?
  • How does my environment compare to that of other Eggs?
  • How can I explain what my Egg is doing to other people?
  • How can I find other Eggs (or other people) like me?

The group produced a few visualizations of Egg and EPA data using R and Processing (until the first Chicago Eggs were transmitting, we prototyped with a European Egg data feed), and created a few very basic maps.
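For a sense of what that comparison looks like in code, here’s a rough sketch using pandas rather than R. The CSV files, column names, and units are hypothetical stand-ins for the Cosm feed and the EPA hourly data; the idea is simply to resample the noisy Egg readings to the EPA’s hourly cadence and see whether the two series move together.

```python
# A sketch of the Egg-vs-EPA comparison using pandas instead of R. The CSV
# files, column names, and units are hypothetical stand-ins for the Cosm feed
# and the EPA hourly data.
import pandas as pd

# Raw Egg readings (unitless sensor values) and EPA hourly NO2 (ppb).
egg = pd.read_csv("egg_no2.csv", parse_dates=["timestamp"], index_col="timestamp")
epa = pd.read_csv("epa_no2_hourly.csv", parse_dates=["timestamp"], index_col="timestamp")

# The Egg reports every few seconds; average it down to the EPA's hourly cadence.
egg_hourly = egg["raw_no2"].resample("1H").mean()
epa_hourly = epa["no2_ppb"].resample("1H").mean()

# Align the two series and ask the simplest question first: do they move together?
combined = pd.concat({"egg": egg_hourly, "epa": epa_hourly}, axis=1).dropna()
print(combined.corr())       # correlation of the two trends
print(combined.describe())   # sanity check on ranges
```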


Wrap Up

As we left for the airport, the sensors were all out for installation. We’re grateful to our Chicago hosts, especially Robb Drinkwater of the School of the Art Institute (SAIC), who handled all of the logistics and organization, and Rajesh Sankaran of Argonne, whose technical expertise was hugely valuable both at the event and in the subsequent care and feeding of the Eggs. Above all, thanks to Charlie Catlett of SAIC and Argonne, who grabbed onto the first mention of a Chicago hackathon and never let it go.


What’s next? The promise and challenge of citizen networks

Whether you’re a citizen enthusiast, a scientist, an open data hacker, a journalist, or a general do-gooder, the idea of being able to look at local air quality measurements of your own surroundings, visualize air conditions, contribute data to a larger community, and ultimately have a full picture of your region and others is extremely compelling. Every task and every decision is in pursuit of this vision.

There are some real technical hurdles to achieving data quality, and it’s impossible to talk about the data without talking about the hardware itself. The inexpensive sensors on the Egg don’t actually measure carbon monoxide or nitrogen dioxide or any other air pollutants. They measure electrical resistance. A metal oxide sensor reacts with chemical compounds in the air, and that changes the overall resistance of the sensor. That’s all we can measure, and that’s why the raw readings coming off the sensor look like a generic “1234” and not “4 ppm”.

It gets more complicated. Temperature and humidity also affect sensor readings, which is why the Eggs have on-board sensors for both and attempt to compensate for fluctuations. A sensor for one pollutant may also be adversely affected by the presence of another — for example, NO2 and SO2 interfere with one another. EPA and other scientific sensors account for this and are also regularly calibrated against a standard. It’s a precise, costly, time-consuming process, and not one that the Egg community seeks to replicate. Practically, this means that the Egg has no means to “zero” itself the way you might do with a bathroom scale before weighing yourself. Data quality is one of the greatest challenges the Air Quality Egg community faces. Different approaches are hotly debated every week on the Google Group. We won’t solve it in the next two months — let alone the two days we had in Chicago — but it’s a question that is perpetually on our minds.
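To make the raw numbers concrete, here’s an illustrative sketch of the kind of math involved: a power-law curve on the sensor’s resistance ratio plus a crude temperature and humidity correction. Every coefficient is invented, and this is not the Egg’s firmware; without the per-unit “zeroing” described above, the output is only directionally meaningful.

```python
# Illustration only: why a raw reading like "1234" isn't yet "4 ppm".
# Metal-oxide sensors are commonly modeled with a power-law curve on the
# resistance ratio Rs/R0, plus a crude temperature/humidity correction.
# Every coefficient here is invented; this is NOT the Egg's firmware.

def estimate_ppm(raw_resistance, r0=10_000.0, a=0.8, b=-1.5):
    """Power-law curve: ppm ~ a * (Rs/R0)^b. r0 is the resistance in 'clean'
    air, and pinning it down is exactly the per-unit calibration problem."""
    return a * (raw_resistance / r0) ** b

def compensate(ppm, temp_c, humidity_pct, temp_coeff=0.01, humidity_coeff=0.005):
    """Crude linear correction for temperature and humidity drift."""
    correction = 1.0 + temp_coeff * (temp_c - 20.0) + humidity_coeff * (humidity_pct - 50.0)
    return ppm / correction

raw = 1234.0  # the kind of number that actually comes off the board
print(compensate(estimate_ppm(raw), temp_c=28.0, humidity_pct=65.0))
```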

After directional data accuracy, a big open question is how to make Egg data meaningful to people. Calibrated government and scientific sensor readings are verified for data quality and then translated into pollutant measurements, such as 8 ppb (parts per billion). While meaningful to scientists, the average citizen couldn’t tell you if such a reading is normal or a cause for concern. The U.S. EPA works with state air quality data to calculate a derived Air Quality Index (AQI) based on several measurements. The AQI translates to a color scale that typically fluctuates from green (healthy) to red (unhealthy) but goes all the way up to maroon (hazardous). We’re just scratching the surface on how to make data more meaningful and understandable to people. Can certain pollutants serve as “markers” of the current state? How do we communicate trend and concern, knowing that we won’t have the accuracy or authority of official measures? These are issues that we’ll continue to debate on the Google Group as we prepare for the next Egg hackathon in Boston. Put October 11-13, 2012 on your calendar!
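The AQI arithmetic itself is simple: a piecewise-linear interpolation between EPA breakpoints. Here’s the general formula with a single illustrative breakpoint row; treat the specific numbers as placeholders, since the official tables span multiple pollutants and averaging periods.

```python
# The AQI is a piecewise-linear interpolation between EPA breakpoints:
#   AQI = (I_hi - I_lo) / (C_hi - C_lo) * (C - C_lo) + I_lo
# The breakpoint values below are placeholders, not the official tables,
# which cover several pollutants, averaging periods, and category edges.

def aqi_from_concentration(c, c_lo, c_hi, i_lo, i_hi):
    return (i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo

# Example: a concentration falling in a band that maps onto AQI 51-100
# ("Moderate", the yellow part of the color scale described above).
print(round(aqi_from_concentration(25.0, c_lo=15.5, c_hi=40.4, i_lo=51, i_hi=100)))
```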

Transparency Camp 2012: Sensors and Civic Dialog

TCamp is one of the events I look forward to all year. It was really special this year because my friend Gregor Hackmack of Germany’s Parliament Watch was in town from Hamburg.

Gregor and I have spent much of the last two years trying to bring his citizen/legislator Q&A platform to the U.S., and he’s in the process of open sourcing it. Parliament Watch has been an astounding success in Germany and has also seen success in Ireland, Austria, and Luxembourg. We held a session together and came seeking a modest U.S. pilot, and we may have a few interested takers. Here are our session notes: How to bring Parliament Watch to your area.

I’ve always wanted to do a session with my friend Alex Howard. Alex has written a ton about sensors and “citizens as sensors” in the context of civic media. We moderated an open conversation on open data opportunities for sensors and discussed projects as varied as Safecast, Trash Track, the Copenhagen Wheel, the Speeding Camera Lottery, and Asthmapolis. It was a ton of fun; we’ll do it again.

Session notes: What do sensors mean for open data?

Photo by Eric Gundersen

Sensors for news: a talk at TechRaking

Javaun Moradi of NPR speaks at TechRaking 2012.

A few months ago, I published some ideas about what the Internet of Things and inexpensive sensors might mean for journalism. It turned out to be the most widely shared and cited piece I’ve ever written.

Last week I attended the TechRaking journalism conference at Google headquarters in Mountain View. I gave a 7-minute Ignite-style talk (a “lightning talk”) expanding on what sensors might mean for news and for engaging public media communities. The deck is below; I’ll add the video once they post it. Meghann Farnsworth recapped the event in detail.

Is NPR a Cult? Or is the world full of plaid?

Photo by Patrick Cooper

This month’s ONA DC meetup was hosted by NPR. Apparently not having learned my lesson that lightning talks take a ton of time — or perhaps because Elise is so charming — I did another one. Also giving talks were Claire O’Neill, Clay Johnson, Michael Maness, and Jon Bruner, who was down from New York. Patrick Cooper did a great writeup of the event.

 

Reinventing radio for digital platforms: My first Ignite Talk

(Or more properly stated: how do you deliver a killer 5-minute Ignite Talk when David Carr, Steven Levy, Tim O’Reilly, and a bunch of other folks you’ve respected for years are your audience?)

So… I survived my first Ignite Talk at News Foo 2011. I really had no idea what I was getting into when I signed up to speak. I read Scott Berkun’s excellent post, which gave me the idea to “hack the format” and use the same slide back-to-back for a 30 second effect while I played audio with my iPhone.

The subject of my talk was reinventing audio for digital devices. Broadcast listening is a linear, time-boxed experience. What might radio sound like if we invented it today? I drew on two projects I’ve worked on at NPR. The first was “You Are Here”, a location-based mobile audio project: what does radio sound like if we know where you’re standing? I got to work with NPR’s Robert Smith, an exceptionally talented storyteller. While “You Are Here” never saw the light of day, some of the lessons and some of the code made it into the second project.

The second project I discuss is the Infinite Player, a personalized, continuous listening experience. This one has gained media attention and our team at NPR believes it holds great promise.

I will do another Ignite Talk, but I can’t emphasize enough the amount of work it takes to do even a mediocre talk. I can prep a one-hour presentation in 30 minutes. A five-minute Ignite requires 30-40 hours’ worth of prep and practice.

A few takeaways:

  • You can never rehearse too much.
  • It’s better to practice all the way through than to keep stopping, as improvisation is key.
  • Edit yourself. You only have time for two points per slide, tops.
  • Sleep is key. I didn’t sleep at all heading into News Foo, and had such a good time I didn’t sleep there either.

How’d I do?

What do open sensor networks mean for journalism?

(NOTE: Pachube changed its name to Cosm.com in May 2012, and then to Xively.com in May 2013.)

If you’re a data journalist or a community activist and you haven’t heard of Pachube (pronounced “PATCH bay”), you should look them up. They’re trying to answer a question that no environmental group or government agency can answer right now: at any given time, how clean is the air in my neighborhood?

Pachube is about to pilot citizen-led air quality sensor networks in New York and Amsterdam. Pachube’s business is to become a data hub for the “internet of things” — internet connected objects and ambient sensors — allowing citizens to share meaningful data and learn from one another. Civic engagement is part of their mission.

The granular air quality data they’re attempting to capture doesn’t exist anywhere. You can download a snapshot of air quality data from the U.S. EPA, but there’s no real-time stream, and the closest EPA sensor is likely miles from your home. Or at least much farther than a DIY sensor you can mount outside your window.

Sensors were on my mind again after all of the discussion of drone journalism last week at News Foo.  I was certain someone must already be doing this project and stumbled across Pachube during my search. I spoke with Pachube’s Ed Borden earlier this week, fully aware that neither he nor any of the volunteers he works with know how this will turn out. I was still very impressed with their level of organization, ambition, and common sense approach to the problem.

Citizen generated data is going to create two big opportunities for news organizations — or for whoever steps up to fill the need.

The first opportunity is that a lot more data is going to create a lot of new areas for news reporting. There will be real-time data visualizations and graduated maps. Experts may provide analysis of seasonal trends and predictions. And when sensor data shows a spike that may indicate negligent or criminal activity, it’s going to take a shoe-leather reporter to fully investigate the matter and hold the responsible parties to account.

The standard analogy is the weather, a $4 billion industry that relies almost exclusively on data gathered by NOAA. While we may not see another cottage news industry this large, the cumulative size of many new niche areas could easily exceed it. If stage 1 of data journalism was “find and scrape data,” then stage 2 was “ask government agencies to release data” in easy-to-use formats. Stage 3 is going to be “make your own data,” and those sources of data are going to be automated and updated in real time.

Inexpensive, ubiquitous citizen sensors aren’t going to have the precision (at the outset, anyway) of more costly professional sensors, but journalists should embrace these networks anyway. These networks aren’t trying to replace scientific and government detection equipment; they’re trying to both fill a data gap and advance the conversation.

Air quality is the perfect test for many reasons. The technology already exists. It’s a fundamentally local (really hyperlocal) issue, but without measurement, it feels abstract. I live near an airport and one of the dirtiest coal-fired power plants on this side of the Mississippi. If our air happened to be dangerous, and if the parents on my block had even the crudest awareness of the air quality trend outside their front door, they might take action. Drastically change their kids’ routines. Or move. Or petition for a rigorous scientific study of local air quality.

The second opportunity that open sensor networks will present for news organizations is the same one they’re already hesitant to embrace on other civic engagement platforms.  (I’m going to take a narrower, more specific view than what Jonathan Stray, JC Stearns, and Melanie Sill wrote this week. But I largely agree with them that we need to be open to redefining how we achieve our missions.)

The open government movement has already spawned many startups to solve problems that citizens and media believe are worth solving. These startups almost always lack a captive media audience. They need help recruiting citizen participants and driving awareness of their platforms.  Whether they know it or not, they do need an objective third party to validate their work and give it authenticity. News organizations are uniquely positioned to serve as ethical overseers, moderators between antagonistic parties, or facilitators of open public dialog.

For lack of a better term, I’ll call this ‘citizen engagement journalism’: applying the newsroom’s tools and values to advance the cause of journalism by means other than reporting.

It’s a responsibility that is every bit as noble as reporting and can achieve the journalism goals of informing the public, investigating corruption, speaking for the voiceless, and seeking truth. The other side benefit is that local media can deeply engage with their audience in new ways.

I think about this a lot, because the public radio system has so much untapped potential to ignite its communities. Public radio has a footprint in every local market, and we’ve known for years that our listeners are dying to interact with us beyond just turning on their radios.

Collaborating is a lot more complicated than it sounds. Startups may have an activist bent and are puzzled why local media isn’t giving them free promotion. News media may either fail to see the new opportunity to become a new kind of community steward, or they may feel genuinely threatened by new civic engagement platforms. Partnerships have been slow to materialize.

Neither startup civic platforms nor media really understand each other’s needs and boundaries, so we can’t pin the blame solely on media intransigence.

This deserves a followup post on how journalists can take a page from civic hackers and fill a new role. I promise to include some detailed examples of where this is already working and how it could be better.

In the meantime, what do you think civic hackers and journalists need to learn about each other to collaborate?

(UPDATE: I think we’re on to something here…)