Flashback time. To the last months of 2004. It was the final semester of my computer science engineering undergraduate course. A few more months to go before I stopped using anything I had learnt in the past four years of college. Final semesters tend to be unhurried. There is usually just one course; the rest of the credits are for a ‘project’. Most students get themselves an internship somewhere. Some others try to do something at a marquee institution like IISc, under a professor of good repute. My two teammates and I, however, took an unusual route. We did nothing for the first 99% of the time, and then, when there were a couple of days left, we coded up a desktop app. (It does sound like how some startups work, doesn’t it?)
The app was called Invest-igator. The name was a horrible pun: that one needs to investigate before one invests. It was essentially a stock market portfolio picker, but it picked stocks using an implementation of fuzzy equivalence matrices. There were some basic concepts of neural networks thrown in too, and the idea was to show that the portfolio it picked would do slightly better than one picked by experts (which we had essentially distilled from some tattered copies of business magazines). My teammates and I waved our hands convincingly enough to confuse the project evaluators into giving us good marks.
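For the curious, here is a minimal sketch of the idea behind a fuzzy equivalence matrix, in the spirit of what an app like that might do (the stock similarity numbers and the grouping step are entirely made up for illustration, not the actual project code): start from a fuzzy similarity relation between stocks, compute its max-min transitive closure so it becomes a proper fuzzy equivalence relation, then cut it at a threshold (an ‘alpha-cut’) to cluster similar stocks.

```python
def max_min_compose(r):
    # Max-min composition: (R o R)[i][j] = max over k of min(R[i][k], R[k][j]).
    n = len(r)
    return [[max(min(r[i][k], r[k][j]) for k in range(n))
             for j in range(n)] for i in range(n)]

def transitive_closure(r):
    # Repeat R := R union (R o R) until it stops changing,
    # turning a reflexive, symmetric similarity relation
    # into a fuzzy equivalence relation.
    while True:
        c = max_min_compose(r)
        merged = [[max(r[i][j], c[i][j]) for j in range(len(r))]
                  for i in range(len(r))]
        if merged == r:
            return r
        r = merged

def alpha_cut_groups(r, alpha):
    # Stocks i and j land in the same group when r[i][j] >= alpha.
    # Transitivity of the closure guarantees the groups are consistent.
    n = len(r)
    seen, groups = set(), []
    for i in range(n):
        if i in seen:
            continue
        group = [j for j in range(n) if r[i][j] >= alpha]
        seen.update(group)
        groups.append(group)
    return groups

# Hypothetical pairwise similarities between four stocks.
sim = [
    [1.0, 0.8, 0.0, 0.1],
    [0.8, 1.0, 0.4, 0.0],
    [0.0, 0.4, 1.0, 0.9],
    [0.1, 0.0, 0.9, 1.0],
]

eq = transitive_closure(sim)
print(alpha_cut_groups(eq, 0.8))  # groups stocks 0-1 and 2-3 together
```

A portfolio picker could then diversify by choosing at most one stock from each group, which is roughly the role such matrices play in fuzzy clustering.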
Based on the anecdote above, you can see that we were very lazy and just a wee bit clever. I wish we had been cleverer, though.
For if we had been, we would have named the app AI+nvestigator. We would have claimed that our bag of tricks was cutting-edge artificial intelligence, and that any use of the app would therefore result in untold riches for its users, and, of course, for us. But then, AI was not as fashionable then as it is now. Sadly, like most things fashionable in the world of technology, not everything termed AI today is actually AI. Artificial intelligence is not a new field of study. It has been around as an academic discipline since 1956, and ever since, there have been numerous achievements, both at a theoretical level and in practice.
What has happened over the past few years, though, is a radical increase in computational capacity. Storing data has never been cheaper, and collecting data never easier. This made a lot of old AI concepts suddenly cheap and practical, and brought forth great and useful results. And before we knew it, a fad was born, and everyone wanted to ride its coattails. Even if your data was a hundred rows in a spreadsheet, and all you did was sort them by one of the column values, what you did was now artificial intelligence. A recent conversation with a friend, who works as a consultant at one of the largest IT services companies, revealed how widespread this syndrome is. Suddenly, every client of theirs was talking about wanting an AI solution to their problems, with very little understanding of what AI is, or of whether they even need it.
I am not saying that every startup out there that claims to be doing AI is out to con you. But there are enough who are. And even if their solutions are genuinely at the cutting edge of AI research, one would do well to remember this Neal Asher quote from one of his novels: “The human mind having been produced by selective insentient evolution, then created Artificial Intelligence, which initially remained distinct from its makers. It is hypothesised that imperfect minds cannot create perfection, because flaws will always be introduced.”
In this weekly column, we discuss the startup workplace. The writer heads product and technology for an online building materials marketplace.