Why Tablets Will Fail (If We Don’t Fix Them)

(Note: This was partially written a few months ago. Times have changed a bit, but tablets are still terrible in my opinion. I have to pat myself on the back for guessing that the Xoom and BB Tablet would fail, although it WAS a fairly safe bet.)

If you look at the pipelines of major technology companies, you will see one item on all of them: tablets. Granted, it’s probably a good idea right now because tablets seem to be the hot new toy. Many tech columnists are saying tablets are the future of computing, though they also acknowledge there will always be a segment that needs high performance. The problem with tablets right now is that they don’t deliver on their promises. Tablets are supposed to be lightweight, portable, and most of all, easy to use. Since the dawn of computing there has always been a push to make things simpler. I’m going to pick on the iPad here because it has such a large share of the market and somehow brought a previously dead market back to life.

I’m going to start with what Apple got right: app management and price. Geeks who have used Linux/Unix package management for years probably wonder why it took so long for a mainstream consumer OS to catch up. Apple did it right, and you can easily download, install, and uninstall applications with a finger press or two. There are no install wizards where you have to choose directories (this is also a downside), so there’s less chance to make mistakes. Price is where Apple also got it spot on. Normally Apple overprices all its products, a markup lovingly called “The Apple Tax” by many, but at $499 the base iPad model creates a real decision for consumers who use their personal computers lightly. They can spend that money on an iPad or buy a cheap laptop with a much more flexible operating system. Apple did a brilliant job marketing the iPad, since many consumers with enough disposable income just buy both. As Steve Jobs himself had said before, though, it creates an unnecessary device between laptops and desktops that consumers truly don’t need. Every other company coming out with a tablet is piggybacking on Apple’s marketing and isn’t developing any new use cases for these devices. I can’t fit the iPad in my pocket, and unlike with the iPhone or iPod, I’d be walking around staring at a 10″ screen instead of a 4″ one, running into people and doors I don’t see.

Other tablets are going to fail way before the iPad, for numerous reasons. The Motorola Xoom will fail because it’s too expensive and Android isn’t as mature as iOS, so customers won’t see as much value in it. The BlackBerry tablet will fail because it requires a separate device for email and 3G access. I don’t care that it’s a wireless Bluetooth connection to a BlackBerry phone; it’s going to be hard to convince enterprise customers to buy a tablet instead of laptops. If you’re just monkeying around with email, all of BlackBerry’s phones do just fine, and people have grown accustomed to that. On the flip side, say you need to type up a business proposal. Fat chance you’re going to type that on a touch screen. Also, unless you have giant hands, the 7″ screen still requires two hands to hold steady. Now that we’ve cleared those competitors away, let’s get back to the iPad.

The iPad’s interface may seem intuitive, but in all honesty, it still doesn’t address some of the problems the computer illiterate face (namely a lot of the older generation). I’ve seen this with a lot of people who don’t completely understand how computer interfaces tend to work. When I tap a link in Mobile Safari and it opens a new page, the other page goes away. It may seem silly since the page is still there, but I’ve seen people freak out because they don’t have a clue where the page they just had open went. Also, there’s no way to restore recently closed tabs in the interface. Sure, there’s history, but a lot of the time I see people closing tabs to “hide” them and then wondering where the heck they went. That’s just one example from Safari, which is probably one of the most used applications on the iPad. Another big problem is file saving. While there is an argument for not having a file browser, it artificially limits what the device can do. You can’t just save a Word document from an email and then access it offline on the iPad. We still have a long road ahead of us in terms of usability and HCI, but I think these interfaces can be improved significantly without limiting the power of the device. That’s the older generation, though; we now have a generation of kids who interact with cellphones before calculators.

I was talking to a teacher at an event, and he made an interesting comment about the first graders he was teaching. The class was learning how to use a basic calculator (those blue TI calculators that always seemed to go missing by the end of the year), and instead of pushing the buttons with one finger, the kids started typing with their thumbs, holding the calculators up to each other and “texting” each other numbers. The next generation will be even more able to handle the complexity of these interfaces, so what use would they have for “simplified” ones? They’re being exposed to technology at such a young age that they won’t have any problems with it. They’re not going to need or want a device like an iPad. They’re going to want the power and flexibility a laptop provides, and as laptops get ever slimmer with SSDs and ever smaller processors, portability won’t even be a factor. For almost the same price and size, they’ll eventually be able to have a laptop with a physical keyboard (does anyone truly like a virtual keyboard, even with haptic feedback?).

I hope the market matures and we’ll see better tablets in a few years, but for now I’m going to stick with a regular laptop and desktop.


Is RSS Dead?

A few months ago Steve Gillmor posted an article called Rest In Peace, RSS. He has this notion that the real-time web will take over and supersede RSS. If there’s one thing I can be sure of, it’s that RSS is never going away. The biggest problem is that websites like Twitter and FriendFeed are single companies, while RSS is a protocol. A protocol is a general set of rules, and RSS isn’t controlled by a single entity. The big question is: what happens if Twitter or FriendFeed fails? They are seemingly becoming large companies, but they aren’t public and rely on private investment. Unless they start becoming cash cows within the next few years, they’ll fade out of existence and become a thing of the past.

This isn’t to say the entire idea of the real-time web is defeated. There are new technologies like Google Wave emerging, but they’re still in a testing phase, and there’s no telling whether they will catch on. The problem of a single point of failure arises again, though. Twitter has been down a lot, but that most likely doesn’t affect 99% of the people in the world with internet access. If people start piling onto a single service, it creates a problem. This has probably even reared its ugly head in the enterprise as companies increasingly move their applications to the “cloud”. The recent outage at Google shows that no matter how large a company is and how large its infrastructure is, there are still failures.

Of course failures will still happen on separate systems, but each one is a contained failure. If I get all my news through Twitter and Twitter goes down, so does all my news. RSS allows for separation; if one feed goes down, it’s not an apocalypse. I have to admit I’m now somewhat of a hypocrite, though. I use NetNewsWire on the Mac for my news feeds, and the new version syncs with Google Reader. I believe Google Reader has its own service that crawls the feeds and puts them in the reader. So instead of pulling directly from each site, there is a layer of abstraction. I wish it pulled the feeds directly, but NetNewsWire is a great client, and even if the feeds can’t update, I still have a nice list on my computer of all the feeds I’m subscribed to, so I can visit them separately if need be.
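For the curious, here’s roughly what “pulling directly from the site” looks like in code. This is just a minimal sketch in PHP (since that’s what powers this blog), and the feed URL is a placeholder:

<?php
// Minimal sketch: pull an RSS feed straight from the source site,
// with no aggregator like Google Reader in the middle.
// The feed URL is just a placeholder.
$feed = simplexml_load_file('http://example.com/feed.xml');

if ($feed === false) {
    // One dead feed is a contained failure -- the rest of your
    // subscriptions keep working.
    die("Couldn't reach this feed; on to the next one.\n");
}

foreach ($feed->channel->item as $item) {
    echo $item->title . "\n";
    echo $item->link . "\n\n";
}
?>

If one site’s feed is unreachable, only that loop fails; an aggregator in the middle trades that independence for convenience.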

Another problem Steve mentioned was information overload: he doesn’t want to parse through all the information. The thing is, the real-time web will eventually get like that too. If you follow too many people on Twitter, you’ll get updates faster than you can read them. I have about 50 feeds in my RSS reader and currently follow 113 people on Twitter. I’m fortunate that many of them don’t post a lot; otherwise I would probably have to stop following them. Newspapers have spoiled us in a way. They have editors who filter through a lot of news and give us what they think is the best. On the web, you and your friends are the only filter. The world is a large place, and if you tried to take in all the raw news, your brain would explode.

Real-time and RSS both have their separate places. In my mind they serve different purposes, and one will not take over the other. I’ve always loved the fast pace of innovation in technology, but some technologies are so pervasive and deeply rooted in society that they’ll never leave. RSS is one of those technologies; it is not dying, nor will it ever be dead.

PS: Just to note, there’s a button in your browser to subscribe to my RSS feed.

Shared Hosting

While working on the site for The Recorder (Central’s newspaper), I’ve had to deal with what we were going to do for hosting. CCSU currently runs all club websites on a Microsoft IIS server. There isn’t even database access, so we’re left with either static pages or crazy ASP flat-file systems. Neither of these was what The Recorder wanted. Their current system was ASP and seemed to be on the brink of collapse.

After a quick check of prices we came down to GoDaddy. I use GoDaddy for all my domain names, but I had never thought of hosting sites there. None of my personal projects are “mission critical,” so I thank ZoomCities for hosting them for free on a fast VPS, but I can’t put a major site on a free hosting provider. I wasn’t involved in the budget discussions, but they settled on the 24-month Unlimited plan. It’s $13.49/month, or $323.76 for the whole shebang. Obviously there are limitations to that. There are physical hardware limitations, but the one thing I can’t get over is the speed.

Since it is an “Unlimited” plan, I bet a lot of people host illegal files and suck up the bandwidth of the shared server, but the slowdown still seems ridiculous. Look at these statistics comparing ping times for my site (dweitz.net) and The Recorder’s site.

--- dweitz.net ping statistics ---
10 packets transmitted, 10 packets received, 0% packet loss
round-trip min/avg/max/stddev = 15.594/17.816/20.278/1.380 ms

--- therecorderonline.net ping statistics ---
10 packets transmitted, 10 packets received, 0% packet loss
round-trip min/avg/max/stddev = 97.226/98.067/101.293/1.112 ms

I haven’t forgotten about speed-of-light differences, though. GoDaddy’s servers are in Scottsdale, AZ, while the server hosting my site is in Ashburn, VA. Those two locations are about 2,203 miles apart. The speed of light is 186,000 miles/second, so data can travel at a theoretical limit of 186 miles per millisecond. That means every 186 miles adds 1 ms of delay we can’t fault the server for. For 2,203 miles that’s a difference of about 11.8 ms each way, or 23.6 ms round-trip. So if that GoDaddy server were in Ashburn, the response time would still be in the area of 73.6 ms round-trip at minimum. That’s a difference of 58 ms between the two servers. Even accounting for slower routes between here and GoDaddy, it wouldn’t be enough to bridge that 58 ms gap.
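If you want to check my math, here’s the same back-of-the-envelope calculation as a quick PHP script (all the numbers come from this post; the rounding differs from mine by a fraction of a millisecond):

<?php
// Back-of-the-envelope latency budget, numbers from the post above.
$distance_miles     = 2203;   // Scottsdale, AZ to Ashburn, VA (approx.)
$light_miles_per_ms = 186;    // speed of light: 186,000 miles/second

$one_way_ms    = $distance_miles / $light_miles_per_ms;   // ~11.8 ms
$round_trip_ms = 2 * $one_way_ms;                          // ~23.7 ms

$godaddy_min_rtt = 97.226;    // measured minimum to therecorderonline.net
$my_min_rtt      = 15.594;    // measured minimum to dweitz.net

// Give GoDaddy full credit for the distance and compare again.
$adjusted = $godaddy_min_rtt - $round_trip_ms;             // ~73.5 ms

printf("Distance accounts for %.1f ms round-trip.\n", $round_trip_ms);
printf("Adjusted GoDaddy minimum: %.1f ms vs. mine: %.3f ms\n",
       $adjusted, $my_min_rtt);
?>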

I can’t blame GoDaddy alone, though. A lot of service providers oversell their shared hosting: 1&1, Bluehost, and Dreamhost, just to name a few. If you want a dedicated virtual server, it starts at $50/month most places, and that’s a basic package. You get a lot more added value with a shared hosting package. Most people are going to be stuck with shared hosting, though, unless they have enough advertising revenue or a massive budget to pay for dedicated hardware.

I hope I get FiOS in my area soon so I can build a server and run it from my house just for the heck of it. Having a server in my kitchen was one of my childhood dreams. I had messed up dreams as a kid.

(PS: First Post of 2009, w00t!)

Data Security

Over the past few days many stories have been written showing that critical systems aren’t as secure as one might think. I just read an article on Slashdot called Most Companies Admit Their Data Is At Risk, and it’s no wonder there are so many stories about identity theft. Health care records, financial data, Social Security numbers, etc. are all at risk. A lot of that risk can be reduced, though.

The biggest problem I see is that a lot of data is out there and accessible on the Internet. Spending more on IT security is all fine and dandy, but many risks stem from uneducated employees who accidentally leak data onto the web. The most secure system in the world is no match for the idiot who has access to the Publish button.

The Large Hadron Collider’s network was attacked recently as well. [source] While it was a simple hack, the LHC IT people revealed that the crackers were one step away from a computer controlling part of the LHC. I don’t even know how this can happen. Who thought it was a bright idea to put those computers on a network accessible from the outside world? Computers that control ATLAS and the other experiments should be on their own network, with no physical connection to the outside world.

The company I work for recently had a breach as well. It wasn’t a computer system, though; it was a CD containing information such as names, Social Security numbers, and other personal details. It was supposed to be sent to the state for tax reporting purposes. The disk never got there, and about two months after it happened I got a piece of snail mail laying out the situation. I was offered a free two-year LifeLock account, courtesy of my employer. Since I now have a LifeLock account, I can safely say my Social Security number is…why should I tell you? Spending money on IT security wouldn’t have helped in this case, but I wonder why the state still handles reporting this way at all.

IT security can always keep improving, but most of the time it’s the people, not the hardware or software, that are to blame. If a system gets breached because the administrator set the password to ‘admin’, who is at fault? The administrator, of course. The system is only obeying what the human tells it. Artificial intelligence is just that: artificial. A system can learn over time, but that’s just the wisdom of the crowd. The only complete solution is to shut down the internet, and unless you want to live in the stone age, I would advise against pulling all those plugs.

Classic: Learning A Programming Language

This post was originally made on January 29th, 2007 on another blog I had.


You may not be a geek, but I’m sure you’ve been curious about what all that stuff on a programmer’s screen actually means. You first need to understand that programming isn’t an easy task. It requires dedication and the curiosity to learn it. Here are my thoughts on learning a programming language.

Like I said in the introduction, this stuff isn’t easy. You may have heard of some magical programming language that is incredibly easy to learn, but that’s just a gimmick. Take Visual Basic: from the name, you might guess it’s fairly simple. I have news for you, though: it isn’t. Back in the day, BASIC was easy to learn, but no one programs in that anymore. Visual Basic is probably one of the easiest languages to pick up, but it still takes reading a couple of books to gain real experience with it.

You also shouldn’t confuse markup languages with programming languages. HTML, the angle-bracket tags that let your browser display everything, is a markup language. HTML is easy to learn and isn’t used to program anything. If you really want to get into making programs for the web, you need to learn a web programming language.

Web programming languages are vast in number, but there are a few quality ones you should look at if you’re interested. One of them is PHP; this site is actually powered by PHP. It isn’t the best by far, but it is the most widely used. There are also Perl and Ruby. Those three are called server-side languages, which means the code is run by the server before the result is sent to the user. Client-side programming mostly means JavaScript. Since Web 2.0 busted out, JavaScript has had a second coming in the form of AJAX web apps and such. There are many libraries to help you out, and if you’re interested in making all that glam, buy yourself a book and learn the basics.
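Since PHP came up, here’s a tiny sketch of what “run by the server” means in practice. The visitor’s browser never sees the PHP, only the HTML it produces (the name parameter is just an example I made up):

<?php
// This runs on the server. The browser only ever receives the
// finished HTML below -- that's the whole idea of "server-side".
$visitor = isset($_GET['name']) ? htmlspecialchars($_GET['name']) : 'stranger';
?>
<html>
  <body>
    <p>Hello, <?php echo $visitor; ?>!
       It is <?php echo date('l'); ?> on the server.</p>
  </body>
</html>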

There is also application programming. The majority of game development today happens in C++. It’s a complex language that may take years to master, but it will certainly be rewarding in both your sense of accomplishment and your salary. Microsoft offers a collection of tools called Visual Studio, built on its .NET Framework; that’s where you’ll find Visual Basic, Visual C++, and Visual C#, the most common ones. Note that Visual C++ is different from standard C++, so you have to choose one or the other. When you use C++ to develop a game, you usually use a graphics library. You’ve probably heard of DirectX, but there are others, like OpenGL or Qt.

Also, if you’re interested, there is programming for microcontrollers. This is what I do on my robotics team; it can be a pain, but it’s fun. When you work with them, you have to take into account the memory limitations and capabilities of the microprocessor. On a normal computer you can freely use floating-point numbers; on a small microcontroller without floating-point hardware, that same math has to be emulated in software and will crawl. Microcontrollers can be found everywhere from your car to your toaster. Someone has to program them; why not you?

I hope I’ve given you some insight into the wonderful world of programming, and maybe you’ll want to get started on a project of your own. It can be frustrating at times, but that’s when you need to try your hardest to understand it. Sometimes you really have to push yourself, because the programmers who design languages usually design them for functionality, not ease of use. You can find programming books in most bookstores or online at Amazon. If you don’t have much money, fire up Google and find some tutorials.

Future Standards

I just read Kyle Neath’s article titled HTML5 and CSS3 are doomed for disaster. I took away a lot from it, but it mostly stated what a lot of people are already thinking: writing specifications is easy compared to implementing them. I think HTML5 will be a bit different, though.

The HTML5 specification is being created with browser developers as part of the process. Since they have an idea of how hard each part of the spec will be to implement, things should go more smoothly. Also, HTML5 is an incremental upgrade. It’s nothing as radical as XHTML, and it provides several neat enhancements that will make developing for the web easier. XHTML, in most implementations, is currently just HTML with self-closing tags. It’s a struggle to design with standards when they aren’t even supported and you’re just left with “tag soup”.

Being a good web designer is partly about developing a design that works across multiple browsers and platforms. This can truly be a pain when you’re trying to get your code to work with IE6. How I yearn for the day we can all gang up on IE6 and simply stop building sites compatible with it. That would force users to upgrade to a browser with at least some standards support. Our entire school system just made the switch from IE6 to IE7. If a school system with hundreds of computers can make the switch, any small company or personal user should be able to as well. All they have to do is click the install button and it works (not 100% of the time, but hey, nothing’s perfect). I by no means advocate IE7, but if it’s the only option, so be it.

The Flash Player 9 has been downloaded over 3.2 billion times.

The reason Flash does so well is that it comes pre-installed on a lot of machines and can update automatically from within the browser. As I have said before on other blogs, people are stupid. They are also a bit impatient. If they have to go out of their way to install another browser or upgrade to, say, IE7, they aren’t going to want to. Hitslink provides a glaring example: IE still controls about 78% of the market, with IE6 still ahead of IE7 by 4%. IE7 was pushed out as a priority update on October 18th, 2006 [source]. That’s over a year ago, and the two versions are about even percentage-wise. Now look at the Firefox trend. Firefox 2.0 has 15% of usage while the previous version, 1.5, has 0.63%. Firefox updates automatically within the browser and applies the update when the browser restarts. No fuss for the user, and most people using the browser are on the latest version. Flash likewise upgrades and installs itself, and people get the latest version without having to do much except maybe click a link.

I digress. For all my hopes, they may just be dashed. I want to be realistic about it, but I so desperately want standards to become, well, standard. Making it so the user doesn’t have to do anything would be a step in the right direction.

Google Chart API

I love graphs. Numbers are incredibly boring when they’re just sitting there in a table or spreadsheet. Making numbers lively requires some imagery. That’s why I love what Google has done with their new Chart API.

They make it dirt simple for anyone to create graphs. Instead of generating a chart image, saving it out, and then realizing you screwed up a number, you can just change a number in the URL and the chart updates instantly. No need to upload any images; just use the <img> tag and point it at the URL.

Once I figure it out more, I think I’ll make a little script in PHP that auto-generates the URLs. Then there will be no excuse for not using graphs when you have numbers to show! This is just an example of what it can do:

http://chart.apis.google.com/chart?cht=lc&chd=t:10.0,80,20.0|30.0,8.0,63.0&chs=250x100&chxt=x,y
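And here’s a rough cut of the kind of PHP helper I have in mind. The parameter names (cht, chd, chs, chxt) come straight from Google’s documentation; the function itself is just my sketch:

<?php
// Sketch of a Chart API URL builder. The parameters (cht, chd, chs,
// chxt) are Google's; the function design is mine and unpolished.
function chart_url(array $series, $width = 250, $height = 100, $type = 'lc')
{
    // Per the chd=t: text encoding, values are joined with ','
    // and multiple series with '|'.
    $data = array();
    foreach ($series as $s) {
        $data[] = implode(',', $s);
    }

    return 'http://chart.apis.google.com/chart'
         . '?cht=' . $type
         . '&chd=t:' . implode('|', $data)
         . '&chs=' . $width . 'x' . $height
         . '&chxt=x,y';
}

// Rebuilds the example URL above (PHP prints 10 instead of 10.0,
// which the chart treats the same way).
echo chart_url(array(array(10.0, 80, 20.0), array(30.0, 8.0, 63.0)));
?>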

VoIP and Precautions Killing Bandwidth

There were many changes at our school this year, and two of them were not welcome: the reduction in bandwidth caused by our new telephone system, and a new phishing filter installed on the routers.

To give you some background, our school network connects to the internet over two T1 lines (about 1.5 Mbps each). Last year you could expect around 70kbps download speed on most sites during the day because of the volume of students using the network, and around 190kbps at night. Since the phone system was installed, you can expect around 20kbps or even slower during the day. I could connect via a modem and get better download speeds than that! This is especially frustrating in my networking class, since we take our tests and read material online, some of it Flash-based content that is horribly slow. And since the phone system is constantly running, it doesn’t get much better at night.
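I don’t actually know what codec or settings our phone system uses, but a rough sketch shows why even a modest number of calls hurts. Assuming something common like G.711, which works out to roughly 87kbps per call once you add packet overhead (the per-call rate and the call count below are my assumptions, not measurements):

<?php
// Rough numbers only: the codec, overhead, and call count are all
// assumptions, not measurements from our actual phone system.
$t1_kbps       = 1544;            // one T1 line
$total_kbps    = 2 * $t1_kbps;    // two T1 lines, ~3 Mbps total
$kbps_per_call = 87;              // assumed G.711 voice + packet overhead

$calls     = 20;                  // guess at concurrent daytime calls
$voip_kbps = $calls * $kbps_per_call;

printf("VoIP uses %d of %d kbps, leaving %d kbps for everyone else.\n",
       $voip_kbps, $total_kbps, $total_kbps - $voip_kbps);
?>

Twenty simultaneous calls would eat more than half the pipe, and the remainder gets split among every student browsing at once.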

The other slowdown is the phishing filter. I don’t get the reasoning for this one, but it’s affecting the system, so I figured I would talk about it. Every single page request has to go through the phishing filter. That takes a fraction of a second when it’s only analyzing one page, but when it’s looking at all the packets requested by everyone in the school, that rises to hundreds of pages per second. The result is waiting at least 10 seconds before pages even start to load. They also “upgraded” to IE7, which has its own phishing filter and its own problems. I usually turn it off, but for those who leave it on, it just makes everything that much slower.

I guess VoIP is the way of the future, but it has to be deployed responsibly. Our IT guys made an assumption about the amount of bandwidth it would take up, and it has cost us. The phishing filter doesn’t make any sense, since people shouldn’t be using their credit cards and getting scammed on school time anyway. I don’t mean to blame the IT guys; they were probably pressured to reduce costs by installing VoIP, and the phishing filter is a response to people’s actions. That’s all for now, see you in the future!

Will The Internet Die?

Ever since video became a reality on the web, there have been concerns over whether the web can handle it. While the internet is a mesh network, it does have some singular points of failure. Many of the root DNS servers are located in close proximity to one another, so if they ever go down, no one will be able to get anywhere unless they know the IP address. Politics is also a big concern as we move forward.

Internet politics encompasses many debates. I’m not talking about a tiered internet, though that is bad enough; I’m talking about disagreements between the Tier 1 ISPs. It wasn’t that long ago that you couldn’t access half the web because Level 3 had a disagreement with Cogent and cut off access. It has since been resolved, but if this happens more often, it will cause more disruptions. Also threatening the web is not just the qualms between companies, but their unwillingness to upgrade their networks. The core fiber and other essential infrastructure that keep the internet running aren’t being upgraded. Although people will transfer around one exabyte of data next year (in case you don’t know, an exabyte is about one billion gigabytes), big corporations such as AT&T are dragging their feet. They can barely handle the load at this level; what makes them think they can keep going this way?

You can also look at the way large websites are run. A lot of the time they’re hosted at a colocation facility, which may have good uptime, but even a few minutes of downtime can mean real money for some people. There have been two examples of this recently, Rackspace and 365 Main, and I wouldn’t be surprised if we see more as more websites pop up. The whole idea of grouping a bunch of websites together in one facility is ironic, since the internet was originally designed so that if one place goes down, everything else stays up.

Don’t be surprised when you try to go to Google.com one day and it doesn’t work. I hate to say I told you so, but it’s inevitable the way things are going.

This Year

I look back at the last major post I made on this blog and think: it has been over a year, and a lot has happened online and in the tech community in general. I’ll start with what powers this very blog.

WordPress has gained an even more massive audience since I last wrote. The community has released a few versions of this wonderful software. I have attempted to switch over to Movable Type and other alternative blogging systems, but I keep coming back to WordPress. I really don’t know what it is, but whatever it is, I hope they continue on the same path. I look at this from a user’s perspective too. As a programmer, I see a few parts that could use a little overhauling, but most of it is just a matter of taste.

If you go back a few weeks, you’ll encounter the release of that $200 PC from Wal-Mart. I have some qualms about gOS, but I’ll save those for another post. The system itself is solid, and it will help people enter the digital age with little impact on their budget. The only problem I foresee with such a system is that the ineptness of the consumer will create a help-desk nightmare. Say someone goes out and buys Microsoft Office for their brand-new PC. Guess what: it won’t work, because the machine runs Linux, not Windows. I personally know plenty of people who don’t know what an operating system is and would be completely pissed off to realize they can’t install a piece of software they just paid $400+ for. Seeing how this is a budget PC though (and it comes with OpenOffice preinstalled), most people buying it probably won’t go out and pay twice the cost of the PC for a single piece of software.

No recap of this year would be complete without a mention of Apple, Inc. From new iPods, to the iPhone, to Leopard, Apple has made a major play in the tech market. It’s been covered ad nauseam in the media, so I won’t repeat what’s already been said. I’ll just say that the products all look cool, and if I had the money, I would buy them.

Google has also been doing fairly well this year. With the recent release of its Android mobile platform and its stock breaking $600 a share, things are looking up at the Googleplex. Google hasn’t been immune to attacks, though. From its attempted buyout of DoubleClick, to take-down notices from the studios against YouTube, to Gmail exploits (partly due to a jar: protocol exploit in Firefox, which is supposed to be fixed in 2.0.0.10), no one can say it’s been easy going.

On the security side of technology and the web, we have the Storm botnet. The botnet is estimated to be really freakin’ huge (that’s just an estimate) and can DDoS sites and take down entire networks, all with a single click. The main problem is that people are curious. They are also inherently stupid. When guys, in particular, see an image that says “click the bunny to see what’s behind it,” nothing will stop them from clicking that goddamn bunny.

Moving to gaming, we had a few big announcements this year. The big games were Halo 3 and BioShock; I’ve played both, and I love them. The console battle has clearly been won by the Wii (I still can’t find any in stores), but the Xbox 360 has also turned out well. The PS3 has suffered from over-engineering and a lack of good games. I have a feeling the PS3 will do better once MGS4 and Haze are released.

So many big events happened this year. If I could, I would cover them all in this blog post, but that would make for headaches and stiff fingers. So until next time, stay classy Internet.