Gnip and Twitter join forces

Gnip is one of the world’s largest and most trusted providers of social data. We partnered with Twitter four years ago to make it easier for organizations to realize the benefits of analyzing data across every public Tweet. The results have exceeded our wildest expectations. We have delivered more than 2.3 trillion Tweets to customers in 42 countries who use those Tweets to provide insights to a multitude of industries including business intelligence, marketing, finance, professional services, and public relations.

Today I’m pleased to announce that Twitter has agreed to acquire Gnip! Combining forces with Twitter allows us to go much faster and much deeper. We’ll be able to support a broader set of use cases across a diverse set of users including brands, universities, agencies, and developers big and small. Joining Twitter also provides us access to resources and infrastructure to scale to the next level and offer new products and solutions.

This acquisition signals clear recognition that investments in social data are healthier than ever. Our customers can continue to build on one of the world’s largest and most trusted providers of social data, and the foundation for innovation is now even stronger. We will continue to serve you with the best data products available and will introduce new offerings with Twitter to better meet your needs and help you deliver truly innovative solutions.

Finally, a huge thank you to the team at Gnip who have poured their hearts and souls into this business over the last 6 years. My thanks to them for all the work they’ve done to get us to this point.

We are excited for this next step and look forward to sharing more with you in the coming months. Stay tuned!

Social Data: What’s Next in Finance?

After a couple of exciting years and some major events in social finance, we’re back with an update to our previous paper, “Social Media in Markets: The New Frontier.” We’re excited to provide this broad update on a rapidly evolving and increasingly important segment of financial services.

Social media analytics for finance has lagged brand analytics by three to four years, despite the enormous potential for profit from investing based on social insights. Our whitepaper explains why that gap has existed and what has changed in the social media ecosystem that is causing it to close. Twitter conversation around tagged equities has grown by more than 500% since 2011, and the whitepaper explores what that means for investors.


We examine the finance-specific tools that have emerged and outline a framework for unlocking the value in social data for tools yet to be created. Then we provide an overview of changes in academic research, social content, and social analytics for finance that will help financial firms figure out how to capitalize on opportunities to generate alpha.

Download our new whitepaper.

Twitter, you've come a long way baby… #8years

Like a child’s first steps or your first experiment with Pop Rocks candy, the first-ever Tweet went down in the Internet history books eight years ago today. On March 21, 2006, Jack Dorsey, co-founder of Twitter, published this.


Twttr (the service’s first name) was launched to the public on July 15, 2006, where it was recognized for “good execution on a simple but viral idea.” Eight years later, that seems to have held true.

It has become the digital watering hole, the newsroom, the customer service do’s and don’ts, a place to store your witty jargon that would just be weird to say openly at your desk. And then there is that overly happy person you thought couldn’t actually exist, standing in front of you in line, and you just favorited their selfie #blessed. Well, this is awkward.

Just eight months after its release, the company made a sweeping entrance into SXSW 2007, sparking the platform’s usage to balloon from 20,000 to 60,000 Tweets per day. Thus began the era of our public everyday lives being archived in 140-character tidbits. The manual “RT” turned into the click of a button, and favorites became the digital head nod. I see you.

In April 2009, Twitter launched the Trending Topics sidebar, identifying popular current world events and modish hashtags. Verified accounts became available that summer; athletes, actors, and icons alike began to display the “verified account” badge on their Twitter pages. This increasingly became a necessity for recognizing the real Miley Cyrus vs. Justin Bieber. If differences do exist.

The Twitter Firehose launched in March 2010. When Twitter gave Gnip access, a new door opened into the social data industry, and come November, filtered access to social data was born. Twitter turned to Gnip to be its first partner serving the commercial market. By offering complete access to the full firehose of publicly available Tweets under enterprise terms, this partnership enabled companies to build more advanced analytics solutions with the knowledge that they would have ongoing access to the underlying data. This was a key inflection point in the growth of the social data ecosystem. By April, Gnip had played a key role in delivering past and future Twitter data to the Library of Congress for historic preservation in the archives.

On July 31, 2010, Twitter hit its 20 billionth Tweet milestone, or as we like to call it, a twilestone. It is the platform of hashtags and Retweets, celebrities and nobodies, at-replies, political rants, entertainment 411, and “pics or it didn’t happen.” On June 1, 2011, Twitter allowed just that as it broke into the photo-sharing space, letting users upload photos straight to their personal handle.

One of the most highly requested features was the ability to get historical Tweets. In March 2012, Gnip delivered just that, making every public Tweet available, starting with the March 21, 2006 Tweet from Mr. Dorsey himself.

Fast forward eight years, and Twitter is reporting over 500 million Tweets per day. That’s more than 25,000 times the daily volume in just eight years! With over 2 billion accounts, over a quarter of the world’s population, Twitter ranks high among the top websites visited every day. Here’s to the times when we write our Twitter handles on our conference name tags instead of our birth names, and prefer to be tweeted at rather than texted. Voicemails? Ain’t nobody got time for that.

Twitter launched a special surprise for its 8th birthday. Want to check out your first Tweet?

“There’s a #FirstTweet for everything.” Happy Anniversary!


See more memorable Twitter milestones

The History Of Social Data In One Beautiful Timeline

As a company that’s constantly innovating and driving forward, it’s sometimes easy to forget everything that’s led us to where we are today. When Gnip was founded 6 years ago, social data was in its infancy. Twitter produced only 300,000 Tweets per day; social data APIs were either non-existent or unreliable; and nobody had any idea what a selfie was.

Today social data analytics drives decisions in every industry you can imagine, from consumer brands to finance to the public sector to industrial goods. From then to now, there have been dozens of milestones that have helped create the social data industry and we thought it would be fun to highlight and detail all of them in one place.


The story begins humbly in Boulder, Colorado, with the concept of changing the way data was gathered from the public APIs of social networks. Normally, one would ‘ping’ the API and ask for data; Gnip wanted to reverse that structure (hence our name). In these early days, we focused on simplifying access to existing public APIs, but our customers constantly asked us how they could get more and better access to social data. In November of 2010, we were finally able to better meet their needs when we partnered with Twitter to provide access to the full Firehose of public Tweets, the first partnership of its kind.

This is when Gnip started to build the tools that have shaped the social data industry. While getting a Firehose of Tweets was great for the industry, the reality was that our customers didn’t need 100% of all Tweets; they needed 100% of relevant Tweets. We created PowerTrack to enable sophisticated filtering on the full Firehose of Tweets and solve that problem. We also built valuable enrichments, reliability products, and historical data access to create the most robust Twitter data access available.

While Twitter data was where the industry started, our customers wanted data from other social networks as well. We soon created partnerships with Klout, StockTwits, WordPress, Disqus, Tumblr, Foursquare and others to be the first to bring their data to the market. Our work didn’t end there though. We have been continually adding in new sources, new enrichments, and new products. We also launched the first conference dedicated to social data as well as the first industry organization for social data. Things have come a long way in 6 years and we can’t wait to see the developments in the next 6 years.

Check out our interactive timeline for the full list of milestones and details.


Gnip's Social Data Picks for SXSW

If you’re one of the 30,000 headed to SXSW, we’ve got our social data and data science panel picks that you should attend between BBQ and breakfast tacos. And if you’re interested in hanging out with Gnip, we’ve listed the places where we’ll have a presence!

Also, we’ll be helping put on the Big Boulder: Boots & Bourbon party at SXSW for folks in the social data industry. Send an email to for an invite.


What Social Media Analytics Can’t Tell You
Friday, March 7 at 3:30 PM to 4:30 PM: Sheraton Austin, EFGH

Great panel with Vision Critical, Crowd Companies and more. “Whether you’re looking for fresh insight on what makes social media users tick, or trying to expand your own monitoring and analytics program, this session will give you a first look at the latest data and research methods.”

Book Signing – John Foreman, Chief Data Scientist at MailChimp
Friday, March 7 at 3:50 to 4:10 PM: Austin Convention Center, Ballroom D Foyer

During an interview with Gnip, John said that the data challenge he’d most like to solve is the Taco Bell menu. You should definitely get his book and get it signed.


Truth Will Set You Free but Data Will Piss You Off
Saturday, March 8 from 3:30 to 4:30 PM: Sheraton Austin Creekside

All-star speakers from DataKind, Periscopic and more talking about “the issues and ethics around data visualization–a subject of recent debate in the data visualization community–and suggest how we can use data in tandem with social responsibility.”

Keeping Score in Social: It’s More than Likes
Saturday, March 8 from 5:15 to 5:30 PM: Austin Convention Center, Ballroom F

Jim Rudden, the CMO of Spredfast, will talk about “what it takes to move beyond measuring likes to measuring real social impact.”


Mentor Session: Emi Hofmeister
Sunday, March 9 at 11 AM to 12 PM: Hilton Garden Inn, 10th Floor Atrium

Meet with Emi Hofmeister, the senior product marketing manager at Adobe Social. All sessions appear to be booked but keep an eye out for cancellations. Sign up here:

The Science of Predicting Earned Media
Sunday, March 9 at 12:30 to 1:30 PM: Sheraton Austin, EFGH

“In this panel session, renowned video advertising expert Brian Shin, Founder and CEO at Visible Measures, Seraj Bharwani, Chief Analytics Officer at Visible Measures, along with Kate Sirkin, Executive Vice President, Global Research at Starcom MediaVest Group, will go through the models built to quantify the impact of earned media, so that brands can not only plan for it, but optimize and repeat it.”

GNIP EVENT: Beyond Dots on a Map: Visualizing 3 Billion Tweets
Sunday, March 9 at 1:00-1:15 PM: Austin Convention Center, Ballroom E

Gnip’s product manager, Ian Cairns, will be speaking about the massive Twitter visualization Mapbox and Gnip created and what 3 billion geotagged Tweets can tell us.

Mentor Session: Jenn Deering Davis
Sunday, March 9 at 5 to 6 PM: Hilton Garden Inn, 10th Floor Atrium

Sign up for a mentoring session with Jenn Deering Davis, the co-founder of Union Metrics. Sign up here -

Algorithms, Journalism & Democracy
Sunday, March 9 from 5 to 6 PM: Austin Convention Center, Room 12AB

Read our interview with Gilad Lotan of betaworks on his SXSW session and data science. Gilad will be joined by Kelly McBride of the Poynter Institute to discuss how algorithms are biased in ways we might not think about. “Understanding how algorithms control and manipulate your world is key to becoming truly literate in today’s world.”


Scientist to Storyteller: How to Narrate Data
Monday, March 10 at 12:30 – 1:30 PM: Four Seasons Ballroom

See our interview with Eric Swayne about this SXSW session and data narration. On the session, “We will understand what a data-driven insight truly IS, and how we can help organizations not only understand it, but act on it.”

#Occupygezi Movement: A Turkish Twitter Revolution
Monday, March 10 at 12:30 – 1:30 PM:  Austin Convention Center, Room 5ABC

See our interview with Yalçin Pembeciogli about how the Occupygezi movement was affected by the use of Twitter. “We hope to show you the social and political side of the movements and explain how social media enabled this movement to be organic and leaderless with many cases and stories.”

GNIP EVENT: Dive Into Social Media Analytics
Monday, March 10 at 3:30 – 4:30 PM: Hilton Austin Downtown, Salon B

Gnip’s VP of Product, Rob Johnson, will be speaking alongside IBM about “how startups can push the boundaries of what is possible by capturing and analyzing data and using the insights gained to transform the business while blowing away the competition.”


Measure This; Change the World
Tuesday, March 11 at 11 AM to 12 PM: Sheraton Austin, EFGH

A panel with folks from Intel, Cornell, Knowable Research, etc. looking at what we can learn from social scientists and how they measure vs how marketers measure.

Make Love with Your Data
Tuesday, March 11 at 3:30 to 4:30 PM: Sheraton Austin, Capitol ABCD

This session is from the founder of OkCupid, Christian Rudder. I interviewed Christian previously and am a big fan. “We’ll interweave the story of our company with the story of our users, and by the end you will leave with a better understanding of not just OkCupid and data, but of human nature.”

9 Questions To Ask Your Social Data Provider

The decision of where to get your social data is not necessarily an easy one. The reality is that each business has unique social data needs, yet there is no blueprint for determining them. While your social data provider should be able to guide you in the right direction, picking a provider in the first place is just as tough. Here are some questions you can use when determining your needs and evaluating social data providers:

1. Can you provide me with all of the data that I need?
This is one of the most important considerations, and it helps to think about it along two dimensions. The first: does your social data provider have access to all the sources you need? The second: does it have access to complete data from those sources?

Dimension 1:
In terms of access to social data, wanting Twitter data is a common place to start; however, complete analysis comes from having data from every source relevant to what you need to analyze. Consider things like physical location, audience demographics, and the types of interactions you care about, and you’ll quickly realize that sources like Tumblr, Foursquare, WordPress, Disqus, and others are critically important to creating a full view of the conversation. Make sure your social data provider can give you all the data you need.


Dimension 2:
When it comes to a provider offering complete data from a source, it is important to note that it is entirely up to the source whether it offers all of its public data, and through whom. Some data sources provide complete access, while others do not. Complete access simply means that a provider receives a stream from the source containing all of the public data available; this is also known as firehose access. Without complete access to a source, a provider cannot claim to give you all of the data you need from it. When a source doesn’t allow complete access, verify that your provider can optimize its requests to get as much data as the source will allow.


2. Can you offer me the level of reliability I need?
Social analysis is only as accurate as the social data analyzed. What kind of reliability can your social data provider offer? If you disconnect from the stream, can they still provide you the data that was missed? Can they do that automatically? What if you’re disconnected for an extended period of time? Those are all important safety-net considerations, but there is a form of reliability that is even better: redundancy. Make sure your data provider offers the ability to consume a second, replicated stream alongside your production stream. A redundant stream can prevent missed data before the miss ever happens. Finally, check whether your social data provider can tell you if you’ve missed any data. When you suspect you’ve missed something important, your provider should be able to tell you whether you have or whether you’re getting all the data you should.
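To make the redundancy idea concrete, here is a minimal sketch of merging a production stream with its replicated backup while dropping duplicates. The function name, stream shapes, and `id` field are hypothetical illustrations, not any provider's actual API:

```python
def merge_redundant_streams(primary, backup):
    """Merge two replicated streams of activities, dropping duplicates.

    Each activity is assumed to be a dict with a unique 'id' field.
    If one stream drops an activity (e.g. during a disconnect), the
    copy from the other stream still gets through -- that is the
    whole point of consuming a redundant stream.
    """
    seen = set()
    merged = []
    for activity in list(primary) + list(backup):
        if activity["id"] not in seen:
            seen.add(activity["id"])
            merged.append(activity)
    return merged

# Simulate the primary stream missing activity 2 during a disconnect:
primary = [{"id": 1, "text": "a"}, {"id": 3, "text": "c"}]
backup = [{"id": 1, "text": "a"}, {"id": 2, "text": "b"}, {"id": 3, "text": "c"}]
full = merge_redundant_streams(primary, backup)  # all three activities survive
```

In a real consumer you would merge in timestamp order and bound the `seen` set, but the dedup-by-ID principle is the same.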

3. Do you provide ways for me to get only the data I need?
Ingesting and storing a firehose of data is too complicated and expensive for most companies to handle. Your social data provider should allow you to filter the firehose down to only the data you need, based on what’s important to your business. Things you may want to filter by are keywords, phrases, from and to operators, contains operators, language, location, and type, although other attributes may be important for your analysis. Make sure your provider lets you filter to get exactly what you need.
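To illustrate what this kind of filtering buys you, here is a toy matcher in Python. The rule shape and field names are invented for illustration; a real provider's rule language (keywords, operators, language, location, and so on) is far richer:

```python
def matches(activity, rule):
    """Return True if an activity satisfies a simple filter rule.

    `activity` is a dict with 'text' and 'lang' fields; `rule` has
    optional 'keywords' (at least one must appear in the text) and
    'lang' constraints. This is a toy matcher, not a real provider's
    rule syntax.
    """
    text = activity["text"].lower()
    if rule.get("keywords") and not any(k.lower() in text for k in rule["keywords"]):
        return False
    if rule.get("lang") and activity.get("lang") != rule["lang"]:
        return False
    return True

stream = [
    {"text": "Watching football tonight", "lang": "en"},
    {"text": "Regardez le match", "lang": "fr"},
    {"text": "New phone day!", "lang": "en"},
]
rule = {"keywords": ["football", "match"], "lang": "en"}
hits = [a for a in stream if matches(a, rule)]  # only the first activity passes
```

Even this toy version shows why filtering at the provider matters: the second activity matches a keyword but fails the language constraint, and you never pay to ingest or store it.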

4. Can I update my filters quickly and easily, without losing data?
Beyond providing ways to filter the social data coming from the sources, the ability to update those filters quickly and easily is important. When you consider how quickly social conversation moves, having to manually update multiple streams can cause you to miss a lot of important conversation. Does your data provider let you update, manage, and organize all of your filters dynamically through a single connection, as opposed to a connection for each filter set? Is this available through an API, so it happens instantaneously? And are disconnections required to update the rule set? If so, you could miss data while the system disconnects and reconnects.

5. Do you offer historical data? And if so, how is it delivered?
Realtime data is the cornerstone of the social analytics industry, but historical data lets you analyze much more, and in new ways. Check with your social data provider to see what historical social data they can make available. Historical data can be delivered immediately or as a batch job; depending on your needs, complexity, and budget, you may only need one form of delivery, or you may end up using both. Consider whether you need historical data, and if you do, make sure your provider can get it to you.

6. What kind of metadata enrichments do you offer?
While the data from the source is primarily what you’re after, additional metadata can help you do better analyses. See what enrichments your provider can include and determine whether they are relevant for you. Is this data redundant with something your business already excels at? Does it provide value you wouldn’t otherwise have? Enrichments in your data stream such as location or influence data can mean the difference between a generic analysis and great insights.

7. Who else relies on your data?
Sometimes the greatest tell of a company is who trusts them. This is especially true with social data where many businesses, big and small, rely on the data as the foundation of their business. Look at who your provider can offer as reference customers and if those companies have similar needs as you.

8. What are you doing to make sure I will continue to get the social data I need?
Social data can’t be here today and gone tomorrow. Consistent, long-term data access means compliance with terms of service, long-term contracts, and economics where everyone succeeds. Make sure your data provider is working directly with the sources on things like sustainability and policies; Susan Etlinger of Altimeter has a great post on why getting data from the source matters. Make sure your data provider is giving you data that’s compliant with the rules of the source providing it. Is your social data provider involved in industry advocacy and improving data quality? Building analytics isn’t easy or cheap, so work with a data provider that’s investing in your longevity and success.

9. What is your pricing based on?
Figuring out how to price social data is not an easy thing, and different data providers tackle that problem differently. At the end of the day, your goal should be to make sure you understand the factors that go into determining your price and finding a package that meets your needs.

This list is not meant to be exhaustive; there are many other things to consider when choosing a social data provider. Make sure to document and ask the questions that are important to you. Hopefully this list helps you get started.

Social Data: Use Cases for the Public Sector


In a blog post last week, we outlined how Microsoft Research was able to indicate whether someone was depressed based on their activity on Twitter, which is groundbreaking research. And this week we looked at how social data can be used for tracking food poisoning outbreaks. We’ve also seen several amazing examples of practical applications of social data as a critical signal and life-saving data source during disaster situations. We still think this is just the beginning in how social data can be used in this sector.

Gnip’s first whitepaper on Social Data in the Public Sector helped outline what social data is, how it can be used, and the current implications of using it. Due to that whitepaper’s success, we wanted to create a follow-up ebook.

This ebook highlights the use cases of social data in government and how organizations can determine the right social data for their needs. We’ll outline how social data is used in epidemiology, natural disaster relief, political campaigns, city planning, law enforcement, and government surveys. You can download the whitepaper here and send any questions you have to

Download the whitepaper here.

Profile Geo: When You Need More Geodata In Your Twitter Data

Sometimes in the world of social data, it is hard to grasp the amazing possibilities when we only use words to describe things. The old adage that a picture is worth a thousand words is true, so we wanted to show you what our new Profile Geo enrichment does.

First, here is what Profile Geo is:
Gnip’s Profile Geo enrichment significantly increases the amount of usable geodata for Twitter. It normalizes unstructured location data from Twitter users’ bio locations and matches latitude/longitude coordinates to those normalized places. For example, everyone who mentions “NYC,” “New York City,” “Manhattan,” and even some odd instances like “NYC Baby✌” gets normalized to “New York City, New York, United States,” so they’re easy to map.
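The core of the normalization idea can be sketched in a few lines of Python. The lookup table, matching logic, and coordinates below are purely illustrative, not Gnip's actual implementation:

```python
# Map the many ways users write a place in their bio to one canonical
# place with coordinates. A real system uses a large gazetteer and
# smarter matching; this table is a toy stand-in.
PLACES = {
    "nyc": ("New York City, New York, United States", 40.7128, -74.0060),
    "new york city": ("New York City, New York, United States", 40.7128, -74.0060),
    "manhattan": ("New York City, New York, United States", 40.7128, -74.0060),
}

def normalize_bio_location(raw):
    """Strip decoration from a bio location and look up a canonical
    place, returning (name, lat, lon) or None if nothing matches."""
    cleaned = "".join(ch for ch in raw.lower() if ch.isalnum() or ch.isspace())
    cleaned = " ".join(cleaned.split())  # collapse runs of whitespace
    for key, place in PLACES.items():
        if key in cleaned:
            return place
    return None

# "NYC Baby✌", "New York City", and "Manhattan" all normalize to the
# same mappable place:
result = normalize_bio_location("NYC Baby✌")
```

The emoji and capitalization are stripped before lookup, which is how odd bio strings still collapse to a single mappable place.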

Now, here is what Profile Geo does in practice for users interested in Twitter geodata:
Football Geo

We think this is really powerful stuff. These maps were created using two sets of Tweets collected over three Sundays, looking for Tweets containing the term “football.” The Standard Geo map comprises Tweets that users specifically geotagged with a latitude and longitude (natively in the Twitter payload). The Profile Geo map comprises additional Tweets that Gnip was able to enrich and assign a latitude and longitude.

As you can see, the amount of location data available through Profile Geo is significantly higher than through Standard Geo. To be specific, we ran our “football” search against the Decahose, a random sampling of 10% of the full Twitter firehose. Standard Geo returned just under 3,000 Tweets, while the Profile Geo search returned more than 40,000 Tweets! (Multiply those by 10 to approximate firehose volumes.) With this additional geodata, the possibilities are vast: the NFL can better understand the demographics of its audience, football clubs in the UK can see how far their reach extends, and TV networks can use the data to tailor media, among countless other uses.

If you were to remove the search for “football” and use the entire firehose of Twitter data, you’d find that you can receive roughly 15 times the amount of geo-relevant data by using Gnip’s Profile Geo enrichment instead of just the geodata in the standard stream. Anyone using geodata in their social data analyses should find great value in this dramatic increase.

If images are better than words, then interactive maps are better than images. Here are the maps so you can play around and see the difference yourself. Zooming in will depict just how much more data is available with Profile Geo in clear detail:

State of Social Media in the Financial Sector

In the two years that Gnip has been working with the financial industry, the state of social data and social media in the financial sector has changed dramatically. Here’s a look at where the industry stands and where we think it’ll go.

Social Media Moves Markets

Perhaps the highest profile event to involve social media in the financial sector this year was the Hash Crash.

On April 23, the AP Twitter account was hacked and tweeted that two explosions at the White House had injured President Obama. The result? The Dow dropped more than 140 points within two minutes. It was an eye-opener for many on the power of social media to move markets. But what got lost in the coverage of this incident was that Twitter was also part of the reason the market rebounded so quickly. What Gnip has seen time and again is that rumors on Twitter are defeated just as quickly as they are started. Immediately after the initial Tweet, many others began debunking the rumor, giving live reports from the White House and setting the record straight.

The Hash Crash provided another loud and clear reason to use social data if you are a participant in the financial markets. Our hedge fund and asset management customers have known this for some time. If you weren’t following and analyzing social media, you were most likely slower than others to understand what was happening in the market, left in the dark.

SEC and Reporting on Social Media

Another big change shaping the industry was an official clarification from the Securities and Exchange Commission allowing companies to announce key information on social media, as long as investors knew those channels would be used. As of today, more than 150 companies use social media to report financial results or performance. Real-estate tech company Zillow (Nasdaq: Z, or $Z) took this concept even further, opening its earnings Q&A up to questions from Twitter. Earnings calls have always been intended to provide color and transparency for all investors and potential investors of a publicly traded company, but in reality they have been events attended and monitored almost exclusively by investment professionals. Opening up the announcement, and especially the Q&A portion, to Twitter isn’t so much a radical new move as a use of new technology to re-align these events with their original intent: giving everyone access to information on the company to make investment decisions.

Social Data in the Markets

When Gnip first started looking at the ways the financial markets could use social data, we never would have guessed how fast the market would grow and how hungry people would be for data. In two years, we’ve seen large-scale growth in hedge funds using Twitter data as part of their trading strategies. Twitter provides a broad-based stream that can answer questions about sentiment toward companies, brands, ideas, and rumors. Investors are finding value both through intelligent aggregation and through data mining. When a merger rumor is breaking, you can find speculative deal values on Twitter before official numbers have been released. In addition to Twitter, financial institutions have found value in similar content from StockTwits, a curated community of financial investors who buy into sharing their thoughts online. StockTwits has been especially valuable for traders and hedge funds who don’t want to sift through the noise on Twitter: if you search for Justin Bieber on StockTwits, you won’t find anything.


And earlier this year, Gnip signed a partnership with Estimize, a crowdsourced earnings estimates platform that provides open financial estimates with incredible transparency, making it a valuable and unique set of social data. Estimize has a platform to capture and provide structure around the long-explored concept of a whisper number. They’ve recently added Vinish Jha, a former StarMine quant, to add a layer of intelligent analytics on top of the open community and to work toward an open estimate that includes only the most accurate prognosticators.

The Adoption of Social Data in Trading Terminals

One of the oft-repeated anecdotes at Gnip is how financial institutions talk about traders and analysts using their iPhones under the desk to keep an eye on Twitter. Due to regulation, most banks and brokerages don’t allow traders to post to social media. To let traders and analysts access social media (but not post), a number of banks and terminal providers have been adding social data to terminals, enabling users to at least look up conversations and research online. In the case of Bloomberg, for now they provide a curated feed, so it isn’t always the complete conversation.

New Uses – Risk Management

Over the next two years, the acceptance of correlations between stock prices and social data will allow for deeper insights. The area I see making the most progress is risk management. A good portion of making money in investing is figuring out how not to lose money. With the S&P on a five-year growth run, it’s no secret that there is a risk of a pullback; the big question is when.

Social data allows for risk modeling that removes one of the inherent biases of price/volume-based modeling. Price and volume of a security or asset move only when investors are ready to take action; social media volumes and sentiment move around thought and discussion. Given that thought and discussion still generally precede action in the strategies of most investors, there is a huge opportunity to pick up on early, previously undetectable correlations between companies and concepts. A quick teaser example below shows normalized rolling 24-hour Twitter volumes for two related securities (LNKD and FB) and two unrelated securities (LNKD and IBM). In the next year I expect more companies to start looking at these types of correlations for risk management, both between securities and around concepts like “government shutdown.”
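To sketch the kind of comparison behind that teaser, here is a simplified version in plain Python that computes the Pearson correlation of two Tweet-volume series. The daily volumes below are invented for illustration; the real analysis used normalized rolling 24-hour volumes:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical daily Tweet volumes (not real data): FB chatter spikes
# with LNKD chatter, while IBM chatter moves independently.
lnkd = [120, 135, 150, 300, 180, 140]
fb   = [400, 430, 470, 900, 520, 410]
ibm  = [210, 205, 214, 208, 216, 209]

related = pearson(lnkd, fb)      # close to 1.0
unrelated = pearson(lnkd, ibm)   # close to 0.0
```

Run over rolling windows, this same calculation surfaces pairs of securities (or securities and concepts) whose conversations move together before prices do.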


So Where Are We Headed?

Many of the initial use cases have involved reading social media for actionable trade ideas, and the growing number of firms offering social-media-based signals shows the success of this approach. The next one to two years will be about expansion in two directions: improvements in implementation and standardization, and expansion of insights. Now that social data has made it through the sandbox phase for certain applications, the focus turns to integrating with existing processes and data sets. The most successful aggregators and indicator providers will partner with exchanges and traditional financial data vendors so their data flows through to existing trading and research systems, making the information more broadly accessible and cheaper to implement. On the raw-data side, more tools will emerge to standardize linking data back to existing security and company identifiers and to accepted industry and index classifications.
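A minimal sketch of that kind of identifier linking is below, using a hypothetical in-memory reference table. A real system would resolve symbols against exchange or vendor reference data (CUSIP or ISIN codes, GICS sector classifications, and so on) rather than a hand-built dictionary.

```python
import re

# Hypothetical reference table; real implementations would use exchange
# or vendor reference data (CUSIP, ISIN, GICS sectors, index membership).
REFERENCE = {
    "LNKD": {"name": "LinkedIn Corp", "exchange": "NYSE",   "sector": "Information Technology"},
    "FB":   {"name": "Facebook Inc",  "exchange": "NASDAQ", "sector": "Information Technology"},
    "IBM":  {"name": "IBM Corp",      "exchange": "NYSE",   "sector": "Information Technology"},
}

# Cashtag convention: a dollar sign followed by a short ticker symbol.
CASHTAG = re.compile(r"\$([A-Za-z]{1,5})\b")

def link_mentions(tweet_text):
    """Extract cashtags and resolve them against the reference table."""
    linked = []
    for symbol in CASHTAG.findall(tweet_text):
        record = REFERENCE.get(symbol.upper())
        if record:
            linked.append({"ticker": symbol.upper(), **record})
    return linked

mentions = link_mentions("Watching $LNKD and $fb ahead of earnings; $XYZ looks quiet")
# $XYZ is unknown to the reference table, so only two mentions resolve.
```

Once mentions carry standard identifiers, they can join cleanly against trading and research systems that already key on those identifiers.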

Using social data in the financial sector is fast becoming a must-have, not a nice-to-have.


If Your Brand Falls in the Woods and No One Hears It, Does It Make a Sound?

For a percentage of iMessage users around the world, recent weeks brought a frustrating period that caused some angst, to say the least. There's nothing worse than relying on a system that suddenly starts to fail you. Thankfully, iMessage isn't quite enterprise-grade infrastructure for most people, though it's definitely in play as a communication system for a lot of folks in the workplace. For the majority of users, the outage just meant they missed a note from their spouse about picking up some bread on the way home from work.

When dealing with software, there are differing levels of importance (and opinion) about quality and how it matters to users. For folks working on the space shuttle, it's pretty important that software functions correctly and is tested without fail. For medical device makers, it's a similar story. Compare that to Candy Crush, and, well, you get the point. What enterprises are finding more and more is that the social aspects of their business are "mission critical." In the evolving industry of social data, everything your audience says about you matters. It is absolutely mission critical to your business to see every Tweet, read every blog mention, and catch every comment about your business; you never know when a piece of social activity will go viral. Consider the "United Breaks Guitars" video or the promoted Tweet by @hvsvn complaining about British Airways. Both aggressively sought resolution through social media, but there are countless less creative complaints that can still impact your business. My point comes down to the old adage: you don't know what you don't know. What happens if you think you're paying attention to what everyone is saying about your brand, your company (your livelihood?), and you miss it?

Social data has become part of the critical IT infrastructure that needs to deliver in realtime, all the time. Some of our most successful partners employ a wide variety of data sources in their product solutions so they can see the whole picture and monitor with confidence. At Gnip we refer to this as the social cocktail, mostly because we like cocktails, but also because it's a blend of varying ingredients that makes a complete product. As my buddy Dave Heal points out in his latest blog post, customers also want reliable, certifiable data that they can count on. You can't rely on the timeliness of scraped data if you're building any type of engagement product; you need the reliability of a firehose in your infrastructure. If a complaint on Twitter goes viral and your system doesn't notify you because of delays, you can imagine the impact. Sales leaders can't walk into meetings unaware that something went viral about their product, and marketing can't respond to bad PR if they don't know something negative hit the wire.
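The kind of delay-sensitive notification described above can be sketched as a trailing-window outlier check on mention counts. The window size, threshold, and alerting logic here are illustrative assumptions, not Gnip's implementation.

```python
from collections import deque
import statistics

class SpikeAlert:
    """Flag minutes whose mention count is an outlier vs. the trailing window."""

    def __init__(self, window=60, threshold=3.0):
        self.history = deque(maxlen=window)   # trailing per-minute counts
        self.threshold = threshold            # alert at N standard deviations

    def observe(self, count):
        """Record one minute's count; return True if it looks like a spike."""
        if len(self.history) >= 10:  # need some baseline before alerting
            mean = statistics.fmean(self.history)
            stdev = statistics.pstdev(self.history) or 1.0
            spike = (count - mean) / stdev > self.threshold
        else:
            spike = False
        self.history.append(count)
        return spike

detector = SpikeAlert()
# A steady baseline of roughly 10 mentions per minute raises no alerts...
baseline_alerts = [detector.observe(c) for c in [9, 10, 11] * 10]
# ...but a sudden burst stands out immediately.
burst_alert = detector.observe(100)
```

The point is less the statistics than the plumbing: a check like this is only as good as the timeliness of the stream feeding it, which is why delayed or scraped data undermines engagement products.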

It's sometimes hard for big corporations to turn on a dime, but it's impressive to watch them change their product lines to digest streaming data from multiple publishers, adjust their rules on the fly, have the vision to ask for historical insight into social data so they can plan forward, and help drive industry change through the Big Boulder Initiative. When a customer calls asking if you can help them build better signaling to surface the one activity that matters among the 4 billion social activities Gnip delivers each day, it puts that missing text about picking up bread in perspective.