
Just "because you can" is no reason to be a dick October 22, 2009 at 3:29 am

I don’t normally blog on other countries’ politics because I don’t want to seem like an arrogant American, but as a consumer of media and a user of the Internet, not to mention a computer scientist and a hopeful future content creator (and, well, because I’m bored right now), I gotta jump in to the debate raging across the pond in England regarding music piracy and how to deal with it. The recently announced deal that will target illegal file-sharers’ Internet connections and reduce them “to a level which would render file-sharing of media files impractical while leaving basic email and web access functional” (after two warning letters) is such a stunning invasion of privacy and such an overreaction that I don’t even know where to begin.

First, let me say that I don’t pretend to think that just because it’s possible to very easily share media files over the Internet without compensating the original creators, it should be categorically allowed. As someone who hopes to create the kind of content (films, in this case) that is massively shared around the Internet every day, I see the problem. I do. For every torrent of Bones or whatever that’s downloaded, that’s a few cents those awesome artists aren’t going to see. So, the first point can be summarized thusly: just because you can download it for free is no reason to be a dick to the artists.

Secondly, show me the post offices where they open our mail and make sure we’re not sending illegal stuff or making illegal plans via letters and I’ll support this plan to track Internet users’ traffic and “shape” it. Show me the highways where there are roadblocks (outside of Iraq and the Mexican border, I mean) to check for people smuggling illegal goods and I’ll support this plan to cut off users who’ve paid for broadband service. Show me the phone company that taps our phones to make sure we’re not using the wires to do illegal things and I’ll support this plan to tap every broadband connection in England to listen for file-sharing traffic.

ISPs have one job: deliver us an Internet connection, and in return we pay them a ton of money. It’s a good agreement for both parties (usually), and both parties keep out of each other’s way for the most part. That means the ISP doesn’t tap my wires looking for “bad” traffic and I PAY THEM EVERY MONTH. If I want to game 24/7, that’s between me and Blizzard. If I want to download from iTunes all the time, that’s between me and Apple. If I want to illegally file-share, THAT’S BETWEEN ME AND THE RECORD COMPANIES. I can’t imagine what the ISPs involved in this deal are thinking. It’s got to be massively expensive (though less than opening all our mail, checking all our car trunks or listening to all our phone conversations), and for what in return? A big bag of money from the record companies? I suppose they’re hoping this will noticeably lessen the traffic on their networks, but the users are paying for that bandwidth. Plus, if they really succeed in cutting down the Internet connections of 7 MILLION PEOPLE, isn’t that gonna kinda make that 11% of the population really, really, really, really PISSED? I don’t know what over a tenth of the entire population of a country being pissed because they just lost their broadband looks like, but I can’t imagine it’d be pretty. So, second point in summary: just because it’s technically possible to track and attack people’s Internet connections is no reason to be a dick about it.

I don’t really have a lot of ideas for fixing the problem, but one thing I do know: the Internet has changed the game and nothing (NO THING) is going to stop people from sharing. What we need to do is redefine the model from one of paying for the actual content to one of paying for what’s around the content. This is already what happens on TV. TV viewers don’t pay money when watching on the TV, but they do pay (in time) for what’s AROUND the content: the 17 minutes every hour of advertising. The small field of web comics also embraces this model: the actual comic is free, but fans pay by looking at ads on the website, buying books, t-shirts, hoodies, coffee mugs, posters, etc. and by making the occasional paypal donation. Only a handful of web cartoonists are able to live full-time off this model, but it’s an idea. The equally small field of web series is also actively creating new ways of making content free of charge. Most people who’re looking more than 5 or 10 years into the future understand that the current model is going to be dead and rotting very soon. The RIAA isn’t scaring enough people (and never will be able to) to reverse the ongoing evolution and (as Shakira puts it) “democratization” of the music and movie industries.

I don’t pretend to have the answer, but I think at the most basic level it will involve the content creators getting closer to their fans and, in a lot of cases, the “middleman’s” role being drastically reduced if not completely eliminated. This is obviously what scares the RIAA and the like, but it’s really exciting to me as a fan (I’ve had personal interactions with some of my favorite cartoonists via Twitter) and as a hopeful future content creator. I think because this new model is going to be decentralized, it will take everyone each adding a piece to the puzzle to BUILD the new systems of income for content creators. The world is full of awesome creative people. 
How long until someone who has not a lick of musical talent but who’s awesome at finding business models on the Internet teams up with a few budding musicians or bands and becomes their personal manager and helps them create a gainful income via a model that bittorrent won’t undermine?

It’s 3:30am and I’m getting really tired. I’m gonna leave this post here and apologize for all the typos (of which I know there are many, as I’ve fixed quite a few dozen already). I also apologize if this doesn’t make as much sense and/or flow as well as I think it does, because I’m pretty tired and might be reading it completely wrong.

‘k enough apologizing. Not gonna apologize for calling a ton of pirates dicks or for calling most British ISPs dicks. Both well deserved titles.


3D model lighting September 24, 2009 at 1:04 am

Frequent readers may recall I posted a video last spring of the little character I’d modeled in 3D for my 3D Animation II: Character Design class. This semester we students are using the same characters we made last semester and building little environments for them. First up is lighting, which we’ve spent the last 3.5 weeks on so far. We’re trying out all kinds of different types of lighting on our characters, and I’ve rendered still frames of some of my favorite lighting environments so far. Read on…

First up is: setting/late evening sun
setting sun
(Click on any image for larger version.)

And the obvious follow-up to that is: dusk

I can’t remember what we were actually doing here and I’d done something wonky with the texture/color of the floor of the room (made it bright blue and reflective) so it came out differently from what the prof was expecting/wanting/demoing, but I liked the result, which I’m calling: ice palace
ice palace

Again I can’t for the life of me remember what my prof called this, but at least it came out the way he was expecting this time. For lack of any better name, I’m calling it beaming fire
fire beam

I’ve really enjoyed looking at lighting so far (despite what I’ve said to my family about not being able to simply take a walk at night anymore without noticing all the different colors of lights) and hope you, fair reader, have enjoyed seeing some of my favorite lighting scenes as well. We’re also working on building whole environments for our characters (I’m modeling a circus tent) and I’ll try to post pictures of that sometime, as well.

Cheers for now,

The unnamed "TD Six"/"TS Erika," or Where Is The NHC? August 31, 2009 at 1:17 am

First, a couple of disclaimers:

1. This is purely fun intellectual speculation. I highly respect the tireless work the amazing folks down at the National Hurricane Center do all year long.

2. The NHC has had this to say about the topic of this post:


So it’s certainly on the Center’s “radar,” so to speak.

However, my informal analysis of the system, dubbed Invest 94L, tells a different story: one of a tropical depression about to become a tropical storm (Erika, soon?).

Let’s back up for a second and look at the definition of a “tropical cyclone.” According to the NHC’s “Glossary of NHC Terms,” a tropical cyclone is defined as follows: “A warm-core non-frontal synoptic-scale cyclone, originating over tropical or subtropical waters, with organized deep convection and a closed surface wind circulation about a well-defined center.”

If we ignore the rather technical terminology in the first half of the sentence and just focus on the second half we can emerge with our two important pieces of information: organized thunderstorms and a closed surface circulation.

Here’s an infrared satellite image of “Invest 94L” about an hour ago:

We can see what the NHC means about minimal thunderstorm activity, but it is looking more organized than a few hours ago (here’s a loop, but it’s live, so if it’s much past 1am Monday August 31 you won’t get much out of it), and since the official Center definition of tropical cyclone doesn’t declare how much “deep convection” (read: strong thunderstorms) has to be present, we’ll just say that this qualifies.

Now for “closed circulation,” which is usually much harder to find and less likely to be present in “tropical disturbances” (read: areas of thunderstorms in the tropics). To investigate this, we can turn to NASA’s QuikSCAT satellite for data about ocean surface winds: the satellite passes over any one spot about every 11 hours, so the data is sometimes dated (it also takes a few hours to post online), but there’s some data covering Invest 94L from 21:13 UTC yesterday (roughly 7.5 hours ago). (Again, that map will be updated with new data sometime on Monday so it might not be what I was looking at. Sorry.)

The QuikSCAT data tells us an interesting story: not only does Invest 94L have a closed circulation, winds at 5:13pm EDT were generally weak, but there was one measurement of 30 knot winds, just 4 knots below the definition for tropical storm force winds (a buoy located near that measurement reported winds of 23 knots and a gust of 29 knots at 2am last night). And as I noted earlier, this system was looking much less organized several hours ago, when that measurement was taken. By now the storm not only has “organized deep convection” but also has a “closed surface wind circulation” with winds of very near tropical storm force. I feel certain that what’s out in the Atlantic, only 48-72 hours away from affecting the easternmost islands of the Caribbean according to some forecasting models (click on “Storm 94″ and again, the data is live, results may vary, blah, blah, blah), is already Tropical Depression Six and should be defined as such. I understand that the thunderstorms are not all packed close around the center of the storm, but the system’s organization over the past few hours has really improved to where it looks just like any other tropical depression: somewhat disorganized, but with the potential (under a favorable atmospheric environment, which the NHC says it has) to turn into a better organized storm, and probably Tropical Storm Erika.
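For reference, here’s a quick sketch of those wind thresholds in code. The function and exact phrasing are my own toy illustration, but the 34-knot (tropical storm) and 64-knot (hurricane) cutoffs are the NHC’s official definitions:

```python
def classify(wind_kt):
    """Return the tropical cyclone category for a sustained wind in knots.

    NHC thresholds: tropical storm at 34 kt, hurricane at 64 kt;
    anything weaker (assuming a closed circulation) is a depression.
    """
    if wind_kt >= 64:
        return "hurricane"
    elif wind_kt >= 34:
        return "tropical storm"
    else:
        return "tropical depression"

print(classify(30))  # the QuikSCAT measurement: still a depression
print(classify(34))  # 4 more knots and it's Tropical Storm Erika
```

So that single 30-knot measurement sits just below the line; a couple knots of strengthening is all that separates T.D. Six from T.S. Erika.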

I hope the NHC will do one of the following at the 5am advisory time:
1. start advisories on Tropical Depression Six, or
2. send an Air Force or NOAA Hurricane Hunter airplane to take a closer look at the system.

I should head to bed, fall semester classes start tomorrow….


Tropical Weekend: Ana, Bill…and Claudette?? August 16, 2009 at 3:31 am

Well, it’s been a big day in the Atlantic, and it doesn’t look like it will end any time soon.

You may have read my history of Tropical Depression Two (now Tropical Storm Ana) yesterday morning, and you may have further seen my tweet about the formation of Tropical Storm Bill (only 6 hours after being declared a Tropical Depression) on Saturday evening.

Coming on the heels of a completely quiet June, July and first half of August, this is all pretty intense. But it’s not over yet. Not only are both Ana and Bill heading toward land (Ana has prompted a Tropical Storm Watch for parts of the Leeward Islands and Bill is currently forecast to pass near the same area as a hurricane next week), but now there’s a new system brewing much closer to home: the Gulf of Mexico, just off the coast of Tampa, FL.

Dubbed Invest 91L*, here’s the latest from the Hurricane Center on this new system:


[*systems that aren't quite organized enough to be called Tropical Depressions but that bear watching are called invests and numbered from 90-99 (when they hit 99 they just go back to 90).]
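That wrap-around numbering is just modular arithmetic. A toy sketch (the helper name is my own invention; the 90-99 cycle is the real convention):

```python
def next_invest(current):
    """Given the current invest number (90-99), return the next one.

    Invest numbers cycle 90, 91, ..., 99 and then wrap back to 90.
    """
    return 90 + (current - 90 + 1) % 10

print(next_invest(91))  # 92
print(next_invest(99))  # 90 (wraps around)
```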

(live IR satellite image of invest 91L)

Invest 91L is already looking like it’s spinning on radar imagery out of Tampa (and on infrared satellite, for that matter) and the radar presentation is overall fairly impressive (see image below). As the NHC noted above, the one thing holding this system back from being declared T.D. 4 (or even T.S. Claudette?) is the surface circulation. Not closed, not a storm. Closed, and the NHC declares it. It’s that simple.

(Click to visit UCAR for latest images.)

If the system does get a closed circulation (which I’m 90% sure it will by 5am EDT, when the NHC issues advisories and would likely start advisories on this system if deemed necessary) it doesn’t have a lot of time to mature before it makes landfall. The NHC has it moving to the NNW at 15mph, and most models bring it inland near the Florida/Alabama state line within the day.

Given the presentation of the system on radar and satellite my money is on T.S. Claudette by the end of Sunday, but we’ll see.

Gotta say, this weekend isn’t boring in the Atlantic!

Stay tuned and stay safe.


The crazy story of Tropical Depression Two August 15, 2009 at 3:54 am

OK, maybe it’s not THAT crazy, but I find it amusing, mostly how the National Hurricane Center is handling the whole thing.

T.D. 2 IR satellite image

Some history:

It’s August and we’ve yet to see a Tropical Storm in the Atlantic. Last year at this time T.S. Fay (the 6th storm of the season) was just forming. All in all, it’s been REALLY BORING in the Atlantic, and I’m guessing (being a weather/hurricane nut just as I’m assuming most of the NHC forecasters are) that the good folks down at the National Hurricane Center might be getting a wee bit bored?

So when T.D. 2 formed last Tuesday morning I imagine there was a bit of a celebration in the office (sometime I need to write a post explaining the fine balance between concern for the safety of those in harm’s way and the oh-finally-something-is-gonna-happen-I-was-about-to-explode-this-is-so-exciting-I-love-storms-even-though-I’m-not-a-sadist feeling that I think most severe weather nuts experience).

Almost 24 hours later (on Wednesday morning) the storm was looking so good that I tweeted my belief that at Advisory 4 (24 hours after the initial advisory) the system would become Tropical Storm Ana (with winds of at least 40mph).

Alas, all was not well with our newest Tropical Depression. By Wednesday evening, barely 36 hours after it formed, the NHC was reporting that it was basically dead, but they kept issuing advisories on the system. This turn of events prompted me to create a twitvid on Thursday morning noting my amusement that the NHC was still issuing advisories on a system that they themselves were saying was basically dead.

Finally on Thursday evening, 24 hours after the storm died according to the NHC, they stopped advisories.

Later that evening I logged on and looked at satellite images of the storm, only to be amused to see that T.D. 2 was playing cat-and-mouse with the NHC by visibly increasing in organization after the Center had stopped officially issuing advisories on the storm. By Friday afternoon it seemed obvious to me that T.D. 2 was determined to make fools of the NHC forecasters who had (rightfully) pulled the plug on the storm the day before.

It took until 12:30am tonight for the NHC to decide that T.D. 2 was back and re-start advisories on the system. To the Center’s credit, according to the forecast discussion they waited for data from no less than four different sources (satellite images, surface wind data recorded via satellite, buoy data and data from a special NOAA jet equipped to fly over hurricanes and tropical weather) before reinstating advisories on the storm. Fool me once, shame on you; fool me twice, shame on the overeager NHC forecasters, right?

The current forecast calls for the storm to become a Tropical Storm on Saturday evening and to pass over or near the northeastern Caribbean islands on Monday.

I know weather forecasting is supposed to be all about science and hard facts, but I’ve seen enough crazy weather (including tropical storms that just screw with the forecasters trying to keep one step ahead of them) that I’m a firm believer in individual storms sometimes just having a mind of their own, and if the story of T.D. 2 so far tells me anything, it’s that this system might just be one of those storms.

Buckle up, it’s gonna be a fun ride.


You give love (and Podcars) a bad name August 11, 2009 at 7:54 pm

Most of my blog readers probably know of my interest in PRT/Podcars, so today I was surfing around on links off a blog post about PRT and I ran into a website called PRTProject.com. Naturally, I was excited. It’s always fun to see new and interesting websites about the concept.

I stopped being excited, and started understanding for the first time half of the arguments against PRT, when I read the second section on the main page: the part where the website author informs us that rather than complement existing road-based transit (the car, the bus, the taxi, etc.), PRT replaces said systems.

Oh. My. F’ing. Goodness. I am truly panicked about the fact that, according to the site counter, 8717 people other than me have read that site and been so entirely misled about an amazing transit concept as to turn away from it entirely for life (I’m assuming that part, because if I had no understanding of PRT I’d run screaming from the idea based on the information on that website).

I am not a huge car fan, but even I can’t fathom having it completely taken away from me in favor of tracks built into the street carrying publicly-run Podcars. My skin crawls at the thought.


OK, background for those confused: PRT is not meant to replace ANY type of transit, rather it’s a concept (that will be proven or disproven at London’s Heathrow Airport in the coming months) that solves quite a few problems that current transit systems don’t (the problem of how to get to a lightrail stop 5 miles away, for example, or how to get across a large corporate/university campus or a large airport or a small town that can’t support any larger transit systems*). Podcars are a beautifully scalable concept, but never in a million years will they, should they, or could they replace ALL cars on the streets of a town. (Masdar City being an exception to this rule whose success remains to be seen.) The beauty of having lots of transit options is just that, OPTIONS. People who enjoy driving should be able to drive. A PRT system should not take that right away from anybody.

[*I used a bunch of small-scale implementation possibilities as examples because I think that's the way PRT will first prove itself, but I think it does hold the potential to cover an entire city. I just think that any kinks will need to be worked out on smaller systems first.]

For some reason (that quite frankly does escape me) it seems that PRT/Podcars is a much more controversial topic than other transit systems (although there’s quite a bit of bickering over HSR in the US, too, I guess) and that’s why I think that a website that blindly positions PRT as something to COMPLETELY replace automobiles, including the roads they travel on (without a word of what roads emergency vehicles [that’s not true, apparently] and other non-automobiles would use) is really dangerous (and naive). Anybody who really understands transit in America understands that you can’t take people’s cars away from them. You can give them more transit options so that those who don’t want to be tied to a car don’t have to be, but you just can’t remove the roads and expect people to take it. And implying that all versions of PRT (the website never stipulates that the idea presented is an extremely far-out version of PRT) involve taking away people’s cars, well, that’s just rude and extremely counter-productive to the cause of getting people to take Podcars seriously.

Badly done, PRTProject.com, BADLY DONE.

That is all.


Just say "no" to DDoS attacks August 7, 2009 at 7:56 pm

Do you remember what happened one year ago today?

Do you remember what happened to Twitter, Facebook and LiveJournal over the last 36 hours?

Given the rumors that the DDoS attacks on these three social networks were carried out by Russian hackers/crackers, I don’t think it’s too much of a stretch to assume that the DDoS attacks were linked with the one-year anniversary of the Georgia/Russia war (that would be the eastern-European ex-Soviet country named Georgia, not the state in the USA). Especially given Facebook’s report that only one person was being targeted in all these attacks: a pro-Georgia blogger going by the name “Cyxymu” on all the targeted social networking sites.

Is it too much to assume that pro-Russian interests wanted to silence this blogger on the anniversary of a war that, while probably the fault of both parties involved, seemed to make most people more angry at Russia than at Georgia?

Not really sure what my point is (and I gotta run so I can’t flesh it out anymore), but I think it might be this message to whomever thought it would be fun to disrupt multiple online services just to attack one person: fuck you.


Connie Schultz and Craig Ferguson need to visit the mid-90s July 15, 2009 at 6:07 am

Remember the mid 1990s, when the World Wide Web was first hitting it big and we were all learning how it worked?

Apparently Craig skipped the section on search engines and how they work. Similarly, Connie seems to have missed the memo going around over the last few years that bloggers are not, in fact, out to get her and her job. (Although I think the New Media movement needs new terms for web publishers, because the term “blogger” can apply to everyone from emo kids with LiveJournals to tmz.com writers to extremely knowledgeable college dropouts to credited journalists writing online at newspapers or even on self-hosted sites. I fully admit that a large percentage of that group cannot be taken seriously to report fact-based news. I also know for a fact that there are many people who proudly describe themselves as bloggers who do original research and news-gathering. To imply that that isn’t true is to admit a profound misunderstanding of Web 2.0 and New Media in general.)

Check out this interview from Monday night’s Late Late Show (the good stuff lasts no more than 2 minutes and starts about 1:15 in):

So much happens in those short minutes that Connie and Craig take to disparage bloggers and search engines that my head nearly exploded when I first saw it.

The thing that grates at me most is Craig’s bold and unilateral pronouncement that search engines are publishers and as such should be held accountable for any illegal activity on the pages they index and link to. While I suppose this is not an entirely unreasonable assumption for someone who’s never used the Internet to make, it is very surprising coming from the likes of Craig Ferguson, someone who, while professing to not understand “The Tweety,” seems to know at least what the Internet and WWW are and also seems to have fairly good judgment about when to back off on a subject. This was obviously not one of those times. News flash, Mr. Ferguson: Google, Yahoo and Bing are not publishers by any stretch of the imagination. They are catalogs, analogous to a library card catalog: they simply tell the user where to find what he or she seeks, having no power over what that information is or who wrote it. The farthest (farthest) you could take your argument is that search engines should de-index a website that is illegal in some way. But to say that Google or Yahoo should be held accountable for the content of the pages they link to (content that can change at any time without the search engine’s knowledge until the search bot passes through the page again) shows such a stunning lack of understanding of the basic workings of the Web that I’m actually kinda sick.
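To make the card-catalog analogy concrete, here’s a toy inverted index, which is the essence of what a search engine stores: a map from words to the pages that contain them, not the pages themselves. (The pages and their contents here are made up purely for illustration.)

```python
# Two made-up "web pages" and their text.
pages = {
    "page1.html": "cats and dogs",
    "page2.html": "dogs playing poker",
}

# Build the index: word -> list of pages containing that word.
index = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, []).append(url)

print(index["dogs"])  # ['page1.html', 'page2.html']
```

Notice the index only records *where* “dogs” appears. If page2.html changes tomorrow, the index has no idea until it’s rebuilt, which is exactly why holding the catalog responsible for the book’s contents makes no sense.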

And Ms. Schultz. “Bloggers…tend to take our work for free…” Let’s focus on that one quote for a moment. Could you be a little more specific? Who? What? When? The aforementioned emo kids are mostly blogging about their own lives (something that could be considered original reporting, btw); the aforementioned tmz.com bloggers, well again, the paparazzi might be annoying to celebrities, but I’d have to go with “original reporting” for tmz-type blogs, too; college (or high school, for that matter) dropouts, OK, that might be your crowd. But still, I’ve seen many an insightful post from such folks and rarely have I run across something that looks like stolen journalism to me.

OK. I should back up, admit that I’m being highly cynical here and admit that I understand (I think) what Ms. Schultz is saying: bloggers often “break” news stories on their blogs that newspaper journalists have already written and published. But to my mind, unless the blogger doesn’t cite/link to the original source (e.g. plagiarizes the article, as it seems the company that Ms. Schultz references being in a court battle with the AP was doing), that is not “taking for free,” it is simply spreading the news with proper citation. Further, my understanding of how most news bloggers operate (including myself) is that we pull in information from several different sources and present it in a (hopefully) unique way. If journalists want to take up the mantle of doing what unpaid (for a lot of us) people with laptops are doing, then be our guests. But I’m guessing there will never be enough journalists to do that, and there shouldn’t be. Journalists hold a certain place in our society, and I don’t for one moment wish fewer journalists in the world. What I do wish for is an understanding of the new ways of news gathering, analyzing and reporting. We still absolutely need news and information quality control, and that’s the next challenge of Web development, especially in these new ways of news processing. Twitter, YouTube, blogs: all are changing the way that we see and think of news, but a lot of it is not checked, and we absolutely need to find a model where all the news (or even most of the news) flowing around the web can be channeled through credible people, be it respected and trusted high-school-dropout bloggers or PhD’ed journalists. The old centralized news model of all reporters being AP certified and all news coming through the TV and newspapers is dying. 
Rather than lumping all bloggers in with a company that was plagiarizing from the AP, journalists should be working with the blogging community to find the best way to help the news flowing around the new, seriously decentralized news model that is the Web pass through credible filters (the AP, trusted bloggers, etc.). But I think you will find, Ms. Schultz, that just lashing out at “bloggers” and talking gleefully about court wins over us is no way to win friends in the model that is the future. The way to change the way news is processed is not through court battles. It’s through cooperation and forward-looking thinking. Otherwise, the old journalist community is gonna be left in the dust, and the world will be a little worse off.

I’ve ranted for 1000 words now, so I’ll quit and go to bed. I hope my 6am post-Harry Potter watching rambles make sense.


Let's Talk About Antitrust July 8, 2009 at 5:41 am

I love Google. I’m not one of those people who worry about privacy issues with my searches or emails, as I believe that Google actually follows, for the most part, its motto of “Don’t Be Evil.”

So naturally I’m really happy this evening to hear that Google is entering the Operating System Wars with Google Chrome OS, which seems, at first blush, like it might actually take a good-sized bite out of Microsoft’s OS market share and therefore make the world a better place. (Setting aside my personal thoughts about Windows for a second, I don’t think any company, be it Microsoft, Google, Apple or any number of Linux companies, should have a 90%+ market share in any market.) Google Chrome OS seems like an awesome step into the future of computing, and while I think that it will probably never achieve majority market share, I do think it will do reasonably well, especially if Google pulls it off, which seems likely given the company’s past successes.

But for the first time ever, tonight I am worried about Google becoming too large. I like everything the company does, but just like I don’t like Microsoft on the principle of Windows owning the OS market, I worry about Google owning over 80% of the search market and making inroads into so many other markets as well. If Google does manage to pull off getting Chrome OS onto millions of Netbooks over the next few years and the OS catches on and starts pulling market share from Windows, will anything be different? My hope is that Chrome OS (which is built on the Linux kernel) will help pull down Windows’ market share and open the door for Mac OS X and other flavors of Linux to rush in, as well as Chrome OS, but I fear what it would mean if Chrome OS pulling market share away from Windows is all that happens. Because if Microsoft’s Bing takes off and pulls searchers away from Google while Chrome OS pulls users away from Windows, nothing will have changed: two huge companies will still control the vast majority of OSes and web searches. Big deal.

I’m not an expert on monopolies and antitrust issues, but I finally think it might be time for someone who is to take a hard look at Google, its products, projects and goals and to see if Google might just be getting a little too big for its britches. If not, I’ll be happy. If so, well, I won’t be surprised.

Here’s to Chrome OS taking Windows down a peg or two*, and lifting all the second-tier OSes up in its place.


*Say, 20-40%? That’d give Windows a still very healthy 50%-70% share of the market.

Bing! Your annoying ads are ready June 30, 2009 at 3:11 am

I have a problem. It has to do with Microsoft’s ads for Bing, the rebranding/relaunch of Microsoft Live Search. This problem might just be bigger than my problem with the Laptop Hunter ads, which is saying quite a lot.

One of the Bing ads:

Here’s my issue: Bing is being billed as a “decision engine” that will reduce “search overload” (see video above), but I’m just not seeing it. I’ve done a few side-by-side searches on Bing and Google and I really see no difference between the results. A search for “huntington, in weather” turns up a weather forecast and current conditions from both search engines, though the current temperature is a degree higher from Bing, with Google‘s closer to what my home weather station is reporting. (In Bing’s favor, it does return Huntington, Indiana as the first result, whereas Google returns Huntington Beach, California first with my correct Indiana city second.) I assume Bing is trying to cash in on the recent media buzz around Wolfram|Alpha (the only “computational engine” currently on the Internet) with the “decision engine” label, but in my opinion it just serves to confuse: the software doesn’t seem to “decide” what information you’re looking for any more than Google or Yahoo does. And to imply that it’s relatively easy to build a “decision engine” (which I can only assume is a cross between a search engine like Google/Yahoo and a computational engine like Wolfram|Alpha) cheapens the concept and insults pretty much everyone in the field of Information Technology, whether they’re working on developing such technology or not.

And then there’s my even bigger beef: “search overload.” What the hell is that? The only way to get search overload is if you get information other than what you were searching for, and no amount of software is going to change the fact that if you do a bad search, you don’t get the information you wanted. I suppose that could lead to information overload, but it’s really up to the end user to become a smart searcher and not try to rely on the computer to read their mind. Look, the user knows what the user is looking for, right? The computer is dumb. It’s dumber than dumb. It’s a freaking idiot. It does what it’s told. Google, Wolfram|Alpha, Yahoo and all the other engines on the web have a lot of programming going into them to try to return the results that are most likely to be what the user wants. But it’s still based on probability, because the computer can’t read the user’s mind. It can’t. Maybe in the far future when we’re all dead computers will be able to read the user’s mind, but for now it’s not possible. So let’s stop pretending the computer can do something it can’t. “Decide”? A computer doesn’t decide; it does what it’s told. It gets a search query, it does what the search software tells it to do. There’s no room for deciding, because it’s a set software program: if search is “A” then do “B”. It’s all very logical and mechanical and binary. Deciding is an emotional process. Obviously there’s logic involved as well, but it requires thought and prior knowledge and all the things that make human brains different from every other “brain” (animal or machine) on earth. Simply put, the search software can include a near infinite number of if statements and it will never really be “deciding” anything; it will simply be following the set software routine, and it will return the same results to the user whether they wanted to know about the cloud type, the band, the movie or the tool when they typed in “anvil” and hit “Search”.
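To drive the point home, here’s a toy version of that “anvil” search (all data invented for illustration): however many if statements you stack up, the same query always produces the same result, because software just follows its rules.

```python
def search(query):
    """A deterministic toy 'search engine': a fixed rule per query."""
    if query == "anvil":
        # The engine can't know if you meant the cloud, the band,
        # the movie or the tool; it returns the same list every time.
        return ["anvil (cloud)", "Anvil (band)", "Anvil! (movie)", "anvil (tool)"]
    return []

print(search("anvil") == search("anvil"))  # True; no "deciding" involved
```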

So instead of making “decision engines” we need to teach users better searching habits. I don’t care if the user wants to use Google, Bing, Yahoo, Wolfram|Alpha or anything else. It’s nobody’s business what engine the user uses, because if we all became a little less lazy and a little more savvy about how we search, the world would be a better place, because we could finally take these annoying “search overload” ads and shove them in a dark closet in Redmond.

And one last thing: Google doesn’t shove as much information at me as Bing does: Google has no annoying (albeit pretty) picture with hover boxes on the main search page and it pushes the “related searches” information to the bottom of the page, not right up top on the side. Little nitpicks, I know, but that extra clutter right at the beginning of my search experience just further justifies my “your ‘search overload’ assertion thing is crap” argument.

Maybe if I tried Bing some more I’d like it. But I doubt it, as every time I see one of those ads I hate Bing a little bit more.