Get your Hack On! Gnip Helps Power an App Developed at the 2011 TechCrunch Disrupt Hackathon

Over 500 individuals recently gathered in New York City for this year’s TechCrunch Disrupt Hackathon. This annual event, fueled by pizza, beer, and Red Bull, features teams of die-hard techies who spend 20 hours, many without sleep (hence the Red Bull), developing and coding the next big idea. Participants compete in a lightning round of pitches in front of a panel of judges, with the winners receiving an opportunity to pitch on the main stage at the TechCrunch Disrupt Conference in front of more than 1,000 venture capitalists and industry insiders.

We are excited that one of the apps developed at the 2011 Hackathon was powered by Gnip data! We love it when our customers find new and creative ways to use the data we provide.

Edward Kim (@edwkim) and Eric Lubow (@elubow) from SimpleReach (@SimpleReach), which provides next generation social advertising for brands, put a team together to develop LinkCurrent, an app powered by Gnip data and designed to measure the current and future social value of a specific URL. When fully developed, the LinkCurrent app will provide the user with a realtime dashboard illustrating various measures of a URL’s worth — featuring an overall social score, statistics on the Klout Scores of people who have Tweeted the URL, how many times the URL has been Liked on Facebook and posted on Twitter, and geo-location information to provide insight into the content’s reach. Call it influence-scoring for web content.
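To make the idea of influence-scoring for web content concrete, here is a purely illustrative sketch combining the signals the post mentions (Klout scores of people who Tweeted the URL, Facebook Likes, and Tweet count). The weights and formula are invented for illustration; LinkCurrent’s actual scoring is not described here.

```python
# Hypothetical influence score for a URL.  The signals come from the post;
# the weights, caps, and formula are invented for illustration only.
def social_score(klout_scores, likes, tweets,
                 w_klout=0.5, w_likes=0.3, w_tweets=0.2):
    """Blend tweeter influence, Likes, and Tweet volume into one score."""
    avg_klout = sum(klout_scores) / len(klout_scores) if klout_scores else 0
    # Cap the raw counts so no single signal dominates the 0-100 range.
    return (w_klout * avg_klout
            + w_likes * min(likes, 100)
            + w_tweets * min(tweets, 100))

print(social_score([50, 70], 40, 10))
```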

The hackathon team also included Russ Bradberry (@devdazed) and Carlos Zendejas (@CLZen), also of SimpleReach, Jeff Boulet (@properslang) of EastMedia/Boxcar (@eastmedia/@boxcar), Ryan Witt (@onecreativenerd) of Opani (@TheOpanis), and Michael Nutt (@michaeln3) of Movable Ink (@movableink). Congratulations to everyone who participated! You created an amazing app in less than 20 hours and developed a creative new use for Gnip data. I highly encourage all of you to check it out: www.linkcurrent.co

Have a fun and creative way you have used data delivered by Gnip? We would love to hear about it, and you could be featured in our next blog post. Drop us an email or give us a call at 888.777.7405.

New Gnip Push API Service

The Gnip product offerings are growing today as we officially announce a new Push API Service that will help companies more quickly and effectively deliver data to their customers, partners and affiliates. (See the TechCrunch article: Gnip Launches Push API To Create Real-Time Stream Of Business Data)

This new offering leverages the Gnip SaaS Integration Platform but is provided as a complete, white-label, embeddable solution that adds real-time push to an existing infrastructure. The main capabilities include the following:

  • Push Endpoint Management: Easily register service endpoints and APIs to create alternative Push endpoints that are powered by the Gnip platform.
  • Real-time Data Delivery: A complete white-label approach allows company-defined URLs to be enhanced for real-time data delivery. Reduce your data latency and infrastructure costs while maintaining control of data access and offloading the delivery to Gnip.
  • Reporting Dashboard: Access important metrics and usage information for service endpoints through a statistics API or a web-based dashboard.

A company can add the Push API Service to its existing infrastructure in hours or days with a few steps.

  1. Tell us how you want Gnip to access your data and APIs; we have several methods depending on your infrastructure.
  2. Integrate the Gnip Push API Service into your website with complete control of the user experience and branding.
  3. CNAME the subdomain to seamlessly add the Push API Service to your existing infrastructure.
  4. Track usage of the new Push API Service using a web-based console or stats API.
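The CNAME step means your branded subdomain simply aliases a Gnip-hosted endpoint. As a rough sketch, the DNS zone-file entry might look like the following (both hostnames here are hypothetical placeholders, not actual Gnip endpoints):

```
; Hypothetical zone-file entry: point a subdomain you own at the
; Gnip-hosted push endpoint so delivery appears under your own brand.
push.api.example.com.   3600   IN   CNAME   yourcompany.gnip-push.example.
```

With the alias in place, consumers hit push.api.example.com while Gnip actually serves the traffic.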

If your company is interested in learning more about how Gnip can help move your existing repetitive API and website traffic to a more efficient push-based approach, contact us at info@gnip.com.

Gnip Conference Schedule for July

The real-time internet is getting more attention, and with that attention come places for people to talk about what it all means. Since Gnip focuses on delivering data for the real-time web, we end up being part of the conversation and many of the solutions.

If you are planning or thinking about attending one of the upcoming conferences, let us know, as we will be attending.

1. Real-Time Stream CrunchUp, July 10th, Redwood City, CA

URL: http://www.techcrunch.com/real-time-stream-and-4th-annual-crunchup-at-august-capital/


2. Real-time Web 2009 Conference, July 29th, Mountain View, CA.

URL: http://rtw09.com/


Got Gnip? Give Us Some Support in The 2008 Crunchies

Are you using Gnip and love getting real-time access to data from various web APIs across the Internet? Show us some love and a vote or two in the TechCrunch 2008 Crunchies:

Best New Startup of 2008

and

Best Technology Innovation/Achievement

Thanks and look for information on our next release in a few days.

The WHAT of Gnip: Changing APIs from Pull to Push

A few months ago a handful of folks came together and took a practical look at the state of “web services” on the network today. As an industry we’ve enjoyed the explosion of web APIs over the past several years, but it’s been “every man for himself,” and we’ve been left with hundreds of web APIs being consumed in random ways (random protocols and formats). There have been a few cracks at standardizing some of this, but most have been left in spec form with, at best, fragmented implementations, and most have been too high level to provide anything more than good bedtime reading. We set out to build something; not write a story.

For a great overview of the situation Gnip is plunging into, check out Nik Cubrilovic’s post on TechCrunchIT: “The New Datastream Aggregators, FriendFeed and Standards.”

Our first service is the culmination of lots of work by smart, pragmatic people. From day one we’ve had excellent partners helping us along the way; from early integrations with our API, to discussing specifications and standards to follow (or not to follow; what you choose not to do is often more important than what you choose to do). While we aspire to solve all of the challenges in the data portability space, we’re a small team biting off small chunks along a path. We are going to need the support, feedback, and assistance of the broader data portability (formal & informal) community in order to succeed. Now that we’ve finally launched, we’ll be in “release early, release often” mode to ensure tight feedback loops around our products.

Enough; what did we build!?!

For those who want to cut to the chase, here’s our API doc.

We built a system that connects Data Consumers to Data Publishers in a low-latency, highly-scalable standards-based way. Data can be pushed or pulled into Gnip (via XMPP, Atom, RSS, REST) and it can be pushed or pulled out of Gnip (currently only via REST, but the rest to follow). This release of Gnip is focused on propagating user generated activity events from point A to point B. Activity XML provides a terse format for Data Publishers to distribute their user’s activities. Collections XML provides a simple way for Data Consumers to only receive information about the users they care about. This release is about “change notification,” and a subsequent release will include the actual data along with the event.
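To illustrate what consuming normalized change notifications might feel like, here is a small sketch. The actual Activity XML schema is not reproduced in this post, so the element and attribute names below are hypothetical stand-ins for the real format:

```python
# Hypothetical sketch: parse a terse activity feed into simple tuples.
# The <activity> element and its attributes are illustrative, not the
# actual Gnip Activity XML schema.
import xml.etree.ElementTree as ET

SAMPLE = """
<activities>
  <activity at="2008-07-01T10:00:00Z" action="dugg" actor="joe"
            url="http://example.com/news/story-123"/>
  <activity at="2008-07-01T10:00:05Z" action="post" actor="jane"
            url="http://example.com/jane/status/42"/>
</activities>
"""

def parse_activities(xml_text):
    """Turn a change-notification feed into (actor, action, url) tuples."""
    root = ET.fromstring(xml_text)
    return [(a.get("actor"), a.get("action"), a.get("url"))
            for a in root.iter("activity")]

for actor, action, url in parse_activities(SAMPLE):
    print(f"{actor} {action} {url}")
```

Note that, matching the "change notification" scope of this release, the events carry only who did what and where; the payload itself would arrive in a later release.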


As a Consumer, whether your application model is event- or polling-based, Gnip can get you near-realtime activity information about the users you care about. Our goal is a maximum 60 second latency for any activity that occurs on the network. While the time our service implementation takes to drive activities from end to end is measured in milliseconds, we need some room to breathe.

Data can come in to Gnip via many formats, but it is XSLT’d into a normalized Activity XML format, which makes consuming activity events (e.g. “Joe dugg a news story at 10am”) from a wide array of Publishers a breeze. Along the way we started cringing at the verb/activity overlap between various Publishers; did Jane “tweet” or “post”? They’re roughly the same thing. After sitting down with Chris Messina, it became clear that everyone else was cringing too. A verb/activity normalization table has been started, and Gnip is going to distill the cornucopia of activities into a common, community-derived format in order to make consumption even easier.
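A verb-normalization table of the kind described above might look like the following sketch. The specific mappings are invented for illustration; the community-derived vocabulary is still being worked out:

```python
# Hypothetical verb-normalization table: publisher-specific activity verbs
# are mapped onto one common vocabulary so Consumers only ever handle a
# single set of verbs.  The mappings here are illustrative, not canonical.
VERB_MAP = {
    "tweet":     "post",      # Twitter-speak for posting
    "post":      "post",
    "dugg":      "vote",      # Digg-speak for voting a story up
    "favorited": "favorite",
    "starred":   "favorite",  # different name, same gesture
}

def normalize(verb):
    # Fall back to the raw verb so events from unmapped Publishers
    # still flow through rather than being dropped.
    return VERB_MAP.get(verb, verb)
```

With a table like this, a Consumer can treat “Jane tweeted” and “Jane posted” as the same event type.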

Data Publishers now have a central clearinghouse to push data when events on their services occur. Gnip manages the relationship with Data Consumers, and figures out which protocols and formats they want to play with. It will take a while for the system to reach equilibrium with Gnip, but once it does, API balance will be reached; Publishers will notify Gnip when things happen, and Gnip will fan-out those events to an arbitrary number of Consumers in real-time (no throttling, no rate limiting).
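The fan-out pattern described above can be sketched in a few lines. This is a minimal in-process illustration of the idea, not Gnip’s implementation (which sits between Publishers and Consumers over the network):

```python
# Minimal fan-out sketch: a Publisher pushes one event into a central hub,
# which delivers it to every registered Consumer -- no Consumer polling.
class Hub:
    def __init__(self):
        self.consumers = []

    def subscribe(self, callback):
        """Register a Consumer's delivery callback."""
        self.consumers.append(callback)

    def publish(self, event):
        # One inbound event fans out to N Consumers in a single pass.
        for deliver in self.consumers:
            deliver(event)

received = []
hub = Hub()
hub.subscribe(lambda e: received.append(("analytics-app", e)))
hub.subscribe(lambda e: received.append(("search-app", e)))
hub.publish({"actor": "joe", "action": "post"})
```

The point of the pattern is that the Publisher does one push per event regardless of how many Consumers are listening; the hub absorbs the delivery cost.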

Gnip is centralized. After much consternation, we resolved to start out with a centralized model. Not necessarily because we think it’s the best path, but because it is the best path to get something started. Imagine the internet as a clustered application; decentralization is fundamental (DNS comes to mind). That said, we needed a starting point and now we have one. A conversation with Chris Saad highlighted some work Paul Jones (among others) had done around a standard mechanism for change notification discovery and subscription; getpingd. Getpingd describes a mechanism for distributed change notification. The Subscription side of getpingd feels like a no-brainer for Gnip to support, but I’m not sure how to consider the Discovery end of it. In some sense, I see Gnip (assuming getpingd’s discovery model is implemented) as a getpingd node in the graph. We have lots to consider in the federated/distributed model.

Gnip is a classic chicken-and-egg scenario: we need Publishers & Consumers to be interesting. If your service produces events that you want others on the network to consume, we’d love to see you as a Publisher in Gnip, pushing events into the system for wide consumption. If your service relies on events created by users on other applications, we’d love to see you as a Consumer in Gnip.

We’ve started out with convenience libraries for Perl, PHP, Java, Python, and Ruby. Rather than maintain these ourselves, we plan on publishing them to the respective language communities’ code sites/repositories.

That’s what we’ve built in a nutshell. I’ll soon blog about exactly how we’ve built it.