I’ve been following the Buzz about Buzz today (click on the link – get it?), and, wanting to try it (since I’m not one of the privileged few bloggers given access at launch), I started browsing on my iPhone, where I heard it was available. Immediately I was presented with a list of people following me whom I was not following back, so I went in and clicked follow on the 300 or so people it said I was not yet following. Big Mistake.
Later in the day I went to check Google Reader, which until today was my RSS Reader of choice, and lo and behold I had over 400 items from just the last hour sitting in my unread items box. It turns out that when you follow someone on Buzz, you also follow them on Reader, and who knows what else across the various Google properties. Now, the only way to bring down the volume of repeat RSS shares from friends in Google Reader is to go into each and every one, mark it hidden, and manually move it into its own separate folder. All this on an already slow Google Reader interface. I’m not looking forward to that.
I have been critical of Google Reader ever since the team introduced social features. Now, rather than being a place where I can just go to make sure I’m getting the latest news from the blogs I subscribe to, as a traditional RSS Reader should be, I’m stuck in a world with hundreds to thousands of shared items from friends, many of them repeats, getting fed to me over and over again, even when I don’t want them! Add to that the likes, the comments, the ability to post “status updates”, and more, and it occurred to me today that Google Reader is no longer an RSS Reader – it is now a Social Network!
I wish Google Reader would just stick to what it’s good at – being an RSS Reader. I need a place I can go just to get the news I want and don’t want to miss. Some say those days are gone, but it’s still a need for me. Today, with the introduction of Buzz, Google Reader became useless to me. If I want to skim the news I can go to Buzz and get all the features of a social network; I don’t need Google Reader to do that for me. But when I just want to read the news I choose, Google Reader has lost its usefulness. Maybe some of this is the reason Google Reader’s former team lead just switched to the YouTube team?
I’m the first to admit RSS is far from dead, but I think it’s time to find another RSS Reader. Should I just switch to Mail.app? Where can one go to get the news these days?
Every time I switch to jQuery, Yahoo’s YUI libraries seem to lure me back. Just yesterday, Yahoo added one more tool to its arsenal of YQL libraries, one that actually makes the Twitter API intuitive, giving me yet another reason to switch back to YUI, or at least consider using Yahoo a little more as I develop tools for the Social Web. The new YQL set of tables for Twitter enables any developer to use simple SQL-like queries to retrieve and post Twitter data.
For simple queries, getting a Tweet’s data is as simple as something like “SELECT * FROM twitter.status WHERE id=’8036408424′;”. To insert data, you simply provide the OAuth consumer key and secret, along with the user’s OAuth tokens, and you can do things like post new status updates for the user, all in JavaScript! A subsequent call to post a user’s status would look something like the sketch below.
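I’m going from memory of Yahoo’s announcement here, so treat the exact table and column names as a rough sketch rather than the documented syntax:

```javascript
// Rough sketch of a YQL write query for posting a status update.
// The column names (status, oauth_consumer_key, etc.) are my assumption of how
// the community twitter.status table is defined -- verify against Yahoo's docs.
var updateQuery =
  "INSERT INTO twitter.status " +
  "(status, oauth_consumer_key, oauth_consumer_secret, oauth_token, oauth_token_secret) " +
  "VALUES ('Trying out the new YQL Twitter tables', " +
  "'YOUR_CONSUMER_KEY', 'YOUR_CONSUMER_SECRET', " +
  "'USER_OAUTH_TOKEN', 'USER_OAUTH_TOKEN_SECRET')";
```

The point is that the write path looks just like the read path: another SQL-like statement, with your app’s consumer key and secret and the user’s tokens supplied as values.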
The cool thing about Yahoo’s YQL Twitter interface is that I can also choose to pull out only specific pieces of information for the user. I’m not quite sure what benefit this gives you, considering Yahoo is probably still retrieving the entire set of data from Twitter (you can’t pull specific pieces of data out of specific objects in the Twitter API), but at least it’s possible, something I’ve been craving from the Twitter API for quite a while. It is unclear whether Yahoo is caching this data; if so, it could provide some significant performance benefits, with Yahoo doing most of the work on their own backend.
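As a sketch of what that field-level selection might look like (the exact field paths are my guess at how the table maps Twitter’s response, so adjust as needed):

```javascript
// Sketch: ask YQL for only the fields you care about rather than the whole object.
// Field paths like user.screen_name are assumptions about the table's structure.
var fieldQuery =
  "SELECT user.screen_name, text, created_at " +
  "FROM twitter.status WHERE id='8036408424'";
```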
Yahoo’s YQL puts them one level above Facebook’s own FQL query language, which only covers Facebook data, by enabling developers to access data like this not only for Twitter but for other environments, including Facebook itself. Yahoo has an entire database of “community tables” where, if specific APIs aren’t already covered, the community can create their own tables for that interface and give developers immediate access to those APIs via a simple, standardized SQL interface to those platforms.
This type of API is exactly what I was looking for from the likes of Google’s Friend Connect APIs (and what Google has still failed to provide) – a standardized platform where one single API gives me access to all the different APIs out there. Now, with standardized SQL, I can access almost any API, and if a table for an API doesn’t exist yet I can create my own interface to it which, once created, will also be accessible via that same SQL interface.
Yahoo now has my attention with this launch. The API has a web interface, where a call as simple as an HTTP GET to http://query.yahooapis.com/v1/public/yql?[query_params] returns an entire structure of XML data my application can access. They provide a YUI JavaScript interface into the table structure so you don’t need a backend if you don’t want one, and I get all this for every API I interface with.
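As a minimal sketch of calling that endpoint directly (the format=json and env parameters are details I’m adding from my own tinkering, so verify them in the YQL console):

```javascript
// Minimal sketch: run a YQL query against the public endpoint.
// format=json swaps the default XML output for JSON; the env parameter loads the
// community data tables, which is where the Twitter tables live as I understand it.
var query = "SELECT * FROM twitter.status WHERE id='8036408424'";
var url = "http://query.yahooapis.com/v1/public/yql" +
          "?q=" + encodeURIComponent(query) +
          "&format=json" +
          "&env=" + encodeURIComponent("store://datatables.org/alltableswithkeys");

fetch(url)
  .then(function (response) { return response.json(); })
  .then(function (data) {
    // YQL wraps results under query.results; the object inside should mirror
    // what Twitter's API returns for that status.
    console.log(data.query.results);
  });
```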
I will now be looking into the Yahoo APIs as I look to interface with the countless APIs available out there, thanks to Yahoo’s focus on cross-platform integration in their YQL interface. I like that Yahoo isn’t being selfish with this. With YQL, Yahoo has finally created a glue that lets me access all the APIs I need as a Building Block Web brick builder.
A common complaint amongst Twitter developers has been that Twitter’s OAuth – the authentication process you see when you click the Twitter login button on a 3rd party website and go to a Twitter-looking page with an “Allow” or “Deny” button – is too complicated. Mainly, from a user experience perspective, users are required to leave the 3rd party site completely in order to log into Twitter, then get redirected back to the 3rd party site again. If anything breaks along the way, the user is left wondering what to do, and valuable logins, purchases, or registrations could be lost. Facebook has solved this by letting users complete the entire login process via JavaScript that Facebook provides, which produces a popup, so users can log into Facebook without ever leaving the 3rd party site. It appears, based on a thread on the Twitter developers list, that Twitter is planning to one-up Facebook by allowing users to log in to 3rd party sites without even needing a popup or any type of redirect, and they’re already testing it with select partners.
The topic came up when other developers noticed that TwitPic.com was allowing direct Twitter logins right on its own website, and somehow posts from TwitPic were showing up with the TwitPic name and link next to the post on Twitter. This normally isn’t possible without OAuth login, because Twitter has disabled that attribution for any Tweet not produced via OAuth. In fact, they have said that in June of 2010 they will completely remove the ability to log in to Twitter from 3rd party sites via plain-text authentication. So how is TwitPic doing it?
According to Raffi, an engineer on the Twitter API platform team, Twitter is currently working on a new “OAuth Delegation” standard that will let applications allow users to log in via Twitter on their own sites, while still maintaining the control over apps that OAuth gives providers and users. So, on TwitPic, for instance, you can log in to TwitPic.com with your Twitter username and password right on the TwitPic site itself, yet you’ll still have full control on Twitter.com to revoke TwitPic’s access at any time. In addition, Twitter can, at any time, remove TwitPic’s ability to publish to or access the Twitter API, since TwitPic still has to use OAuth to make Twitter API calls.
If the hints in the developers list thread prove true, developers will be able to take the plain-text username and password and still store them somewhere, but in order to make calls to the Twitter API they’ll have to send an OAuth key with their requests along with some way of identifying the user. My guess is that, in essence, the app will send a one-time login to Twitter on behalf of the user (most likely over a secure SSL channel or similar), and Twitter will return an OAuth token the app can use to make API requests on behalf of that user in the future. In my opinion, this is still no different from storing an OAuth token in a database, which gives an app the same access as the user’s Twitter username and password would.
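To make that guess concrete, here is a purely hypothetical sketch; neither the endpoint nor the parameter names below are anything Twitter has confirmed, they just illustrate the kind of one-time exchange I’m describing:

```javascript
// Hypothetical sketch only: trade the user's credentials once, over SSL, for an
// OAuth token the app stores and uses for all later API calls. The endpoint and
// parameter names are invented for illustration, not part of any Twitter API.
function exchangeCredentialsForToken(username, password) {
  var body = new URLSearchParams({
    username: username,               // sent once, not kept by the app
    password: password,
    consumer_key: "APP_CONSUMER_KEY"  // identifies the app, just as OAuth does today
  });

  return fetch("https://api.twitter.com/oauth/delegated_access_token", { // hypothetical
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: body.toString()
  })
    .then(function (response) { return response.text(); })
    .then(function (text) {
      // The app would keep only the returned token and secret, just as it keeps
      // OAuth tokens today -- which is why I argue the storage risk is the same.
      return Object.fromEntries(new URLSearchParams(text));
    });
}
```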
Security Concerns
While the storage may be no different, I’m sure there will still be those concerned about this approach. For instance, what happens when users get used to entering their Twitter usernames and passwords on 3rd party websites and then do so on a malicious one? The rampant phishing attacks we’ve seen recently show how accustomed people already are to entering Twitter credentials on sites that merely look like Twitter.
Maybe Twitter is feeling comfortable enough that they can be proactive about such misuse and password collection. The risk is still there, though, and hopefully the OAuth Delegation standard Twitter is getting ready to launch will address this problem.
Partners
Thus far, it seems TwitPic is one of the partners testing this new delegation standard Twitter is working on. Several others were mentioned in the developer discussions as well. For instance, Seesmic Look is also taking similar credentials without any OAuth redirect, yet still shows the “Look” source in Tweets generated with the app. One developer pointed out the information that can be retrieved by inspecting these new requests, and the security of it all is a little concerning.
Whatever it ends up being, the winners will be desktop and mobile client developers. Right now, developing a mobile or desktop app involves deep integration with the browser in order to legitimately get the user logged into the app. It is why we see so few native desktop clients and so many AIR apps – AIR is essentially a browser-based solution.
I’m very interested to see what happens. The Twitter team is supposed to announce more details very soon, and I’d like to find out more about what this means for developers, how secure it is, and how much recoding I’ll have to do to enable it in my app. Whatever it is, you can bet it will be one step simpler than the already simpler solution Facebook provides. This is getting very interesting! Let the API wars begin…
Recently, here in Salt Lake City, we had the opportunity to have Eric Schmidt, CEO of Google, visit. While I didn’t have the chance to see it, from what I’ve read he seemed to address a common worry I hear throughout this state. Here in Salt Lake City and the surrounding area we have a lot of successful businesses! From my uncle’s Freeservers.com, to Omniture, to Mozy, to Novell, WordPerfect, and many others, there’s no shortage of success in this area. It’s a hotbed of talent and technology that the world doesn’t give enough credit to. The problem is that we have no Yahoos or Googles or Facebooks or Microsofts to show for that success. We have no home-grown success story that didn’t eventually sell out for big bucks to one of the big West Coast companies. I think this is a common problem for many areas. Why is this?
Eric Schmidt tried to come up with his own reasons in response to Utah Senator Orrin Hatch, who stated, “We get a corporation going and it has some tremendous ideas and all of the sudden someone comes up from Silicon Valley and buys it and takes it back there.” Schmidt responded, saying, “I don’t know whether [improving the situation means] globalizing the business. I don’t know whether we need more venture capitalist presence in Utah or maybe just more experience building the businesses from the startup. It’s not that businesses aren’t getting started, it’s that once started they aren’t growing the businesses fast enough.” So what is it that keeps the would-be Googles and Microsofts from staying in Utah (and other states) and growing to compete with the big guys?
I’ve suggested the PR problem before. That’s just one problem Utah has – a lack of enough tech bloggers to get the word out to Silicon Valley. One other common problem I see in Utah is that we get greedy. I’m not even saying that’s a bad thing. Too many Utah startups are focused on the money rather than an underlying cause that motivates their revenue stream. That focus is part of the reason Utah businesses have been successful – we have some of the smartest business people in the world right here. Even Eric Schmidt confirmed that, stating that “Utah is one of the best places to do business.” We know how to make money! Unfortunately, however, that’s also what differentiates us from West Coast companies like Google.
I argue it all revolves around cause. Let’s look at Eric Schmidt’s own company, Google. Everything they do centers around one central cause: “Don’t be evil.” It doesn’t even matter whether a given effort has an immediate purpose; everything they do must be done “the right way”, even if they lose money from it. Some even argue this has become a PR pitch for them as well. Google is willing to lose money for their cause, yet they are also making money because of it. It’s an amazing strategy.
Facebook also does this well. I’ve done a lot of work with Facebook, with two books on the company and several apps written around their platform. When you interact with them and their employees, you get a common theme: they are doing all they can to enable people to share in bigger and better ways. Their vision is to help you share without risking your privacy. Everything they do revolves around that – their revenue model is built around their cause.
Twitter is building “the pulse of the internet”. They want to enable better communication between anyone in the world. They’ve forgone revenue to ensure that takes place (yet they’ve been able to raise a ton of capital, I realize, but I argue that’s in part due to their cause).
I see the same thing from company to company in the Bay Area and even up in tech hotbeds like Seattle (home of Amazon, Microsoft). These guys all drive revenue based on purpose! While there are currently a few exceptions, I don’t quite see this in Utah and other states, especially amongst the larger startups. It’s all business.
Eric Schmidt also stated that “It’s not an attitude problem, it’s an availability problem. To me, it’s recruiting new talent into the state and growing new talent. It’s really people and expertise and that’s the way to make it happen.” Guess what drives and keeps talent? Motivation. If people have a cause to work for, they come, they stay, and they work hard at it. I remember that at BackCountry.com (a Utah company), our mantra was “We use the gear we sell”. Employees loved it, because all kinds of incentives were given to get them out using the cool gear we sold.
80% of Utah’s population is in the Salt Lake City area. Schmidt suggested this was an incredible opportunity for people to connect. I think we just need motivation to encourage that connectedness. Motivation is what makes the Googles and Facebooks and Microsofts of the world.
If you’re a startup, anywhere, what are you building on top of? Where are your foundations? Are you building for money or for purpose? I know as I build my business I’m going to be thinking much, much more about changing the world and less about the money I make as a result of that. The money will come naturally. That is how you build Google, and keep it there.
What’s your cause? What businesses do you think do this well? Please share in the comments.
EDITOR’S NOTE: Two companies in Utah that I think are doing really well at this are Phil Windley’s Kynetx and Paul Allen’s FamilyLink. When you interact with them you can sense their cause. It bleeds through the company. People are sacrificing time and money just to be sure their cause is getting through. As a result, Paul Allen’s company was recently ranked one of the fastest growing companies by comScore, and recently, according to Compete.com, surpassed his old company, Ancestry.com, in traffic. Cause eventually pays off! I encourage you to learn what they do – they won’t be going away any time soon.
Marketers seem to never learn. Time and time again they have sacrificed loyal relationships with customers in order to take the easy road, in hopes of converting the small percentage out of millions that might turn into one-time sales. Affiliate marketing is rife with these people hoping to “get rich quick”, without regard to how it is done. I sometimes wonder if these people would sacrifice their own souls to gain a quick buck. It would certainly seem so, as we have been inundated with junk mail and e-mail spam, viruses, worms, porn, and other tools intended to spread what they’re selling to mass audiences as fast as possible.
Technology has tried hard to stop these problems. We have anti-virus solutions that stop malware, but evidently they aren’t good enough, because viruses and worms and malware still spread. Google’s Gmail has excellent spam filtering for e-mail, as do other services such as Yahoo Mail and Hotmail. Yet I still get spam e-mail. There are even services that try to reduce the amount of junk mail you receive, yet even those aren’t foolproof. It seems no matter how much technology we throw at it, the spammers will always find a way to circumvent the process.
Government is doing all it can as well. Here in the United States, the CAN-SPAM Act makes it easier for the government to prosecute spammers. The act was meant to thwart the problem in the early 2000s, when e-mail spam was running rampant. The marketers all complained, claiming it would reduce the amount of money they could make, and worries of economic crisis ensued. But after the act went into effect, marketers began to realize they were actually making more money than before, because they were now focusing on people who were interested in their product rather than people who weren’t. I admit the amount of spam I received went down around that time.
Enter 2010. Twitter is almost a standard. Facebook is almost a standard. We are seeing the era of micro-messaging take form, and it doesn’t seem this era is going away any time soon. As with any new communications technology, the spammers come along with it. As I can attest from my own company, the spammers are now out of control on Facebook and Twitter and almost any other service that enables micro-messaging, and those services are fighting their hardest to stay on top of it all. I admit they’re probably doing all they can, too.
On SocialToo, in just the last month, we have automatically marked nearly 3,500 DM messages as spam across the 3,500 users that utilize the service. Since we implemented the feature just a few months ago we’ve marked nearly 8,500 DM messages as spam. And that’s just DMs on Twitter! Considering there are tens of millions of users on the service, and DMs aren’t the only means of spam, you can see the problem Twitter and Facebook are facing.
It was for this reason that I added these spam-filtering services on top of SocialToo. I too want to do what I can to help kill these problems. I’ve seen it all – even people abusing my own service to increase their numbers and in turn spam their followers with things those followers never intended to receive. It was also why we recently complied with Twitter’s request to remove the automatic unfollow of those who unfollow you, and frankly I agree with Twitter on the move – they’re doing the best they can to thwart spammers, and I want to support them in that process. Look at this video I found on YouTube recently – in it, a man is demoing software that uses a combination of your desktop and outsourced workers in India (likely through services like Amazon’s Mechanical Turk) to quickly create accounts, send a few tweets from each to gain and grow followers, and then spam those followers with affiliate links. It’s appalling the way he says this is a “secret” only a “select few” marketers know about – the fact is I already knew about it – it’s no secret:
This guy’s software is just one of many, and I argue it does this the hard way. We now have the ability for applications to sit on top of the browser and completely control the context in which a user views the web. Applications like Greasemonkey, extensions and plugins, and even Kynetx, while they can be used for good, could all be used in this way, with just simple HTML and JavaScript, to create accounts and spam with them. There’s not much Twitter or Facebook or even the makers of Greasemonkey, Firefox, Chrome, IE, or Kynetx can do about it (although Kynetx at least has a controlled user directory through which they can at least monitor these things). There are already tools like Hummingbird out there that do this relatively cheaply, and there will be more.
It’s time for government to step in and put an end to this. CAN-SPAM was written for long-form communications; it needs to be modified to cover the short form. It specifically mentions e-mail and cell phone communications, not micro-messaging services. Recipients should still have the opportunity to opt out of the messages they receive. Perhaps the enablers of such communication, such as Facebook and Twitter, need to provide a means for message senders to attach an opt-out location to their messages. That’s just one idea – I’m sure there are many other ways of doing this.
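As a rough illustration of that one idea (none of this exists today; the field names are purely made up), a commercial message might carry something like:

```javascript
// Purely illustrative: what an opt-out annotation attached to a commercial
// micro-message could look like. No such fields exist on Twitter or Facebook today.
var commercialMessage = {
  text: "20% off all widgets this week!",
  sender: "@example_store",
  commercial: true,                          // flags the message as marketing
  opt_out_url: "https://example.com/opt-out" // where recipients can unsubscribe
};
```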
CAN-SPAM needs a provision that specifically targets the micro-messaging space. It needs language that specifically says what marketers can do on these services, and how people can opt out. As I know very well, this will not stop all such messages, but it will cut off a large majority of them – messages which, I know, are being sent even by otherwise legitimate lawyers and doctors and business owners across the US, costing Twitter thousands of dollars and wasting the time of countless people.
We need to do all we can to stop this nonsense. I want to see these micro-messaging spammers prosecuted. It won’t happen unless the US Government modifies CAN-SPAM. How can we do this effectively in the micro-messaging space?