
Showing posts from January, 2014

Untranslatable

The ease and reasonable accuracy with which Google Translate translates words, sentences and paragraphs sometimes makes us forget how hard it is to translate entire books. So why is translating books so hard? The usual “culprit” is context: the background, cultural practices and accepted/expected behavioural norms of one language (and by extension, region) don’t apply in others. Context aside, some words have no equivalents in other languages. But contrary to what you might think, this article on “true untranslatability” says the issue is not the lack of one-to-one words: “The common trope that language X has no word for Y is usually useless (it usually means language X uses several words instead of one for Y).” It cites this great example: “shockingly specific single words in other languages like mamihlapinatapei, which is apparently Yagan for ‘the wordless yet meaningful look shared by two people who desire to initiate something, but are both reluctant to start’”…

Newspapers, Don’t be Story-Centric

Way back in 2006, Adrian Holovaty wrote the following prophetic advice for journalists: “Newspapers need to stop the story-centric worldview.” On the face of it, this sounds nonsensical. Isn’t a journalist’s job to collect information and then write the story? Holovaty explains: “So much of what local journalists collect day-to-day is structured information: the type of information that can be sliced-and-diced, in an automated fashion, by computers.” He went on to emphasize that he was not talking about things like making the article suitable for digital formats, because that would just be “changing the format, not the information itself”. Rather, he was advising them to focus on the raw data itself. But the typical journalist reaction to that was: “Displaying raw data does not help people; writing a news article does help people, because it's plain English.” Holovaty was really asking that, the story aside, the raw data too should be captured…
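Holovaty’s essay used examples like a fire report: the article is prose, but behind it sit facts a computer could query. Here is a minimal sketch in Python of what “capturing the raw data” might look like; the fields and values are hypothetical, purely for illustration:

    # A sketch of Holovaty's idea: capture the facts behind a news story
    # as structured data, not just prose. Fields and values are hypothetical.
    fire_report = {
        "event_type": "fire",
        "date": "2014-01-12",
        "location": {"address": "123 Main St", "city": "Springfield"},
        "injuries": 2,
        "fire_trucks_dispatched": 3,
        "cause": "electrical fault",
    }

    # Stored this way, stories can be sliced-and-diced by computers:
    def count_events(reports, event_type, city):
        """Count reports of a given type in a given city."""
        return sum(1 for r in reports
                   if r["event_type"] == event_type
                   and r["location"]["city"] == city)

    print(count_events([fire_report], "fire", "Springfield"))  # -> 1

Once every fire, burglary and council meeting is captured like this, the “story” becomes just one view generated from the data, which was exactly Holovaty’s point.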

Think Illogically

We know about the Googles and the Facebooks, companies that went from zero to tens of billions in the blink of an eye. But have you wondered how the venture capitalists, the people who fund such companies when they are starting out, decide how to pick them? Which companies to fund, which ones to ignore? Paul Graham of the venture fund Y Combinator described the very interesting and, of course, counter-intuitive way of thinking (if it was intuitive, would we be asking the question?) that is needed. Most of us are risk-averse: we dislike losing more than we like winning. So if asked to invest, our first instinct is to ask, “Is it likely to succeed?” The right question for a venture capitalist, however, is: “Is it likely to succeed really, really big?” Which is very hard, because as Graham says: “You have to ignore the elephant in front of you, the likelihood they'll succeed, and focus instead on the separate and almost invisibly intangible question of whether they'll succeed really big.”
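The arithmetic behind that mindset is plain expected value: a tiny chance of a huge win can be worth more than a good chance of a modest one. A toy illustration (the probabilities and return multiples below are invented, not real data):

    # Toy expected-value comparison; numbers are made up for illustration.
    safe_bet = {"p_success": 0.50, "return_multiple": 2}     # likely, modest
    moonshot = {"p_success": 0.01, "return_multiple": 1000}  # unlikely, huge

    def expected_return(bet):
        """Expected multiple per dollar invested (failure returns 0)."""
        return bet["p_success"] * bet["return_multiple"]

    print(expected_return(safe_bet))  # 1.0  -- break-even on average
    print(expected_return(moonshot))  # 10.0 -- the counter-intuitive bet wins

Which is why the elephant (will they succeed at all?) matters less than the intangible question of whether a success would be enormous.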

Time Travelers

This post is about two time travelers, one from the past and the other from the future. First, the one from the past, the one Tim Wu describes: “A well-educated time traveller from 1914 enters a room divided in half by a curtain. A scientist tells him that his task is to ascertain the intelligence of whoever is on the other side of the curtain by asking whatever questions he pleases. The traveller’s queries are answered by a voice with an accent that he does not recognize (twenty-first-century American English). The woman on the other side of the curtain has an extraordinary memory. She can, without much delay, recite any passage from the Bible or Shakespeare. Her arithmetic skills are astonishing—difficult problems are solved in seconds. She is also able to speak many foreign languages, though her pronunciation is odd. Most impressive, perhaps, is her ability to describe almost any part of the Earth in great detail, as though she is viewing it from the sky.” And so, says Wu…

Networked Lenses

With the advent of the smartphone, the digital camera became ubiquitous. And soon photos of cats, sunsets, breakfasts and selfies began to rule the day. No wonder then that every time Instagram goes down, news sites have fun with lines like “Breakfasts everywhere went undocumented” or “Instagram is down – just describe your lunch to me”! But has the smartphone-Internet combo really ruined photography? Isn’t Alex Furman’s point about the “immensely valuable and often overlooked” aspects of photography just as true today: “It forces you to see the world around you in a completely different way. It teaches you to find beauty and impact and symbolism in places that most people wouldn’t grace with a second look. Photography teaches you to pay attention and to appreciate. It’s about seeing much more than it is about capturing what you see.” What did the smartphone-Internet combo change? “Now we have legitimately capable cameras in our cell-phones, complete with filters and…”

Purpose of Religion

Most scientists, especially in the Western world, are atheists or agnostics. The history of the Church obviously contributes to that situation, the poster child for which is Galileo. But far more than that, I think it’s a philosophical clash between the approaches of science and religion that makes scientists pick the former over the latter. As Richard Feynman described it: “How a scientist can take a mystic answer I don’t know because the whole spirit (of science) is to understand.” Feynman elaborated on what would happen when you applied the most basic of scientific principles (to doubt, to question) to religion: “Once you start doubting, just like you’re supposed to doubt, you ask me if the science is true. You say no, we don’t know what’s true, we’re trying to find out and everything is possibly wrong. Start out understanding religion by saying everything is possibly wrong. Let us see. As soon as you do that, you start sliding down an edge which is hard to recover from…”

When News and Ads Mix

[Image: Economic Times page containing an advertorial]
A couple of years back, my dad commented on the tendency of the news media to “be irresponsible, sensation-mongers, emotion-whippers, a gang that succumbs to the rot of commercialization”. Of course, things have only gotten worse since then. Many ads look exactly like content: there's even a term for it, the advertorial. There's nothing new about this (magazines have had such sections for ages), but now it's becoming hard to tell the difference between an ad and content, since even the font, layout and background look like regular content. Check out this Economic Times advertorial from 2 years back to see what I mean: hard to say which piece is the advertorial in the pic above, right? (It's the one titled “Move Towards ‘CoreFirst’ Competence”). And the digital world just followed that trend. Last week, the New York Times unveiled its new website design, with some articles labeled as “paid post” and set off by a blue line for demarcation. But Emily Bell worries…

Marx and the Information Age

In Greek mythology, Procrustes was the giant who stretched or cut his victims to make them fit his bed. Today, the eponymous term refers to a person who imposes conformity without concern for individuality. This, of course, is what Adam Smith’s capitalism did to workers in the Industrial Age. In his TED talk on keeping people motivated at the workplace, Dan Ariely describes how Smith’s capitalism placed its emphasis on efficiency, which meant breaking work down into steps and then creating specialists for each step. Eventually, everyone becomes a cog in the wheel and doesn’t know the big picture, or how they contribute to the final output. At which point they stop caring. Fast forward to the Information Age. Now worker alienation can be costly. As Ariely describes it: “And you can ask yourself, what happens in a knowledge economy? Is efficiency still more important than meaning? I think the answer is no. I think that as we move to situations in which people have to decide on their own…”

Tech Moves @ Warp Speed

There was a time in the technology industry, not very far back, when companies dominated for long periods, often decades. Like IBM. Or Microsoft. Then the Internet arrived, with its “power vested in me by nobody in particular” philosophy, as stated by Marc Andreessen, co-founder of Netscape! And the rules of the game changed forever. Neil Gaiman described it as follows: “The rules, the assumptions, the now-we're supposed to's of how you get your work seen, and what you do then, are breaking down… The old rules are crumbling and nobody knows what the new rules are. So make up your own rules.” As Joshua Topolsky wrote, today even six years feels like an eternity in technology: “Only six years ago, the vast majority of people had no idea that a phone could do anything other than make a phone call. The concept of ‘apps’ did not exist at all. The idea of using a phone as a GPS unit or a camera was laughable. Twitter and Facebook were nascent distractions. Netflix…”

Time Zones Insanity

Every time the US shifts into or out of Daylight Saving Time (clocks move forward an hour in spring and back an hour in fall), it is a big nuisance for me: all my evening/night calls with the US start (and end) later. In her article, Allison Schrager starts by criticizing Daylight Saving Time everywhere it is practiced: “It’s a controversial practice that became popular in the 1970s with the intent of conserving energy… It also creates confusion because countries that observe daylight saving change their clocks on different days.” She then goes one step further and proposes reducing the number of time zones in the world, citing China as a country that has a single time zone even though it spans 5 time zones geographically (I would add India to the list): “America started using four time zones in 1883… Now the world has evolved further—we are even more integrated and mobile, suggesting we’d benefit from fewer, more stable time zones. Why stick with a system designed for commerce in 1883?”
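To see the nuisance concretely, here is a small Python sketch using the standard library’s zoneinfo module (Python 3.9+); the dates are chosen to straddle the US shift to Daylight Saving Time on 9 March 2014:

    # The same 9 a.m. New York call lands at different Indian times
    # across the US Daylight Saving Time switch (9 March 2014).
    from datetime import datetime
    from zoneinfo import ZoneInfo

    new_york = ZoneInfo("America/New_York")
    india = ZoneInfo("Asia/Kolkata")  # India: one zone, no DST

    for day in (5, 12):  # one week apart, either side of the switch
        call = datetime(2014, 3, day, 9, 0, tzinfo=new_york)
        print(call, "->", call.astimezone(india))

    # 2014-03-05 09:00:00-05:00 -> 2014-03-05 19:30:00+05:30
    # 2014-03-12 09:00:00-04:00 -> 2014-03-12 18:30:00+05:30

Same wall-clock meeting in New York, yet it shifts by an hour in India: exactly the headache Schrager wants to get rid of.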

Erasable Internet Now an Option

Most of the stuff on the Internet is stored permanently, on the servers of one or the other of the major companies. The downside, of course, is that anything you said or any pic you posted (or in which you were tagged) can be used against you…forever. Which is perhaps why Farhad Manjoo called the chat app Snapchat “the most important technology of 2013”, only half tongue-in-cheek. So what did he find so revolutionary about the app? “Snapchat is one of the first mainstream services to show us that our photos and texts don't need to stick around forever.” This is what Manjoo calls the “Erasable Internet”. This is almost heresy to the Silicon Valley orthodoxy that “data is inviolable” and that “you can never have too much data”. The “delete” option is almost always an afterthought (if such a thought even occurs) in an era of dirt-cheap storage. Costs aside, this is the age of Big Data: and so if a company doesn’t have data, well, what would it crunch? Web entrepreneur Anil Dash…
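What would the “Erasable Internet” look like in code? Here is a minimal sketch of an ephemeral message store, where every message carries an expiry and any read after that point returns nothing; this illustrates the idea only, not Snapchat’s actual design:

    # A minimal 'erasable' store: every message self-destructs.
    # Illustrative only -- not how Snapchat actually works.
    import time

    class EphemeralStore:
        def __init__(self):
            self._messages = {}  # id -> (text, expiry timestamp)

        def put(self, msg_id, text, ttl_seconds):
            """Store a message that expires after ttl_seconds."""
            self._messages[msg_id] = (text, time.time() + ttl_seconds)

        def get(self, msg_id):
            """Return the message if still alive, else delete it for good."""
            entry = self._messages.get(msg_id)
            if entry is None:
                return None
            text, expiry = entry
            if time.time() >= expiry:
                del self._messages[msg_id]  # gone forever
                return None
            return text

    store = EphemeralStore()
    store.put("snap1", "this will vanish", ttl_seconds=2)
    print(store.get("snap1"))  # 'this will vanish'
    time.sleep(2.5)
    print(store.get("snap1"))  # None -- the Erasable Internet in miniature

The design choice worth noticing: here deletion is the default and keeping data is what would take extra work, the exact inverse of the orthodoxy above.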

The Key to Every Lock

Pretty much every security system of the digital age is rooted in a mathematical assumption: that certain classes of problems will take forever to solve (even by the fastest classical computers of the future; quantum computers are another matter), while at the same time any solution can be checked very quickly. A maze gives a feel for such problems: depending on complexity and size, finding the path from A to B could take forever, but any proposed path can be checked very fast in comparison. Here is the risk: what if that assumption is wrong? What if a generic method does exist for solving such problems quickly? In fact, there is a $1 million reward for anyone who can prove whether or not the assumption is correct (in mathematical lingo, this is called the “P versus NP problem”). In an episode of Sherlock, the modernized (and awesome) BBC version of the detective, his arch-enemy, James Moriarty, says he has found a way to break into any security system anywhere in the world…
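A toy illustration of that asymmetry, using subset sum (a classic problem of this kind): given some numbers, is there a subset adding up to a target? Brute-force search tries up to 2^n subsets, but verifying one proposed answer is a single pass:

    # Finding is slow, checking is fast: the asymmetry behind P vs NP.
    from itertools import combinations

    def find_subset(numbers, target):
        """Brute force: tries up to 2^n subsets (explodes as n grows)."""
        for r in range(1, len(numbers) + 1):
            for subset in combinations(numbers, r):
                if sum(subset) == target:
                    return subset
        return None

    def verify_subset(numbers, subset, target):
        """Checking a claimed answer: quick, regardless of n."""
        return all(x in numbers for x in subset) and sum(subset) == target

    numbers = [3, 34, 4, 12, 5, 2]
    print(find_subset(numbers, 9))            # (4, 5) -- found the hard way
    print(verify_subset(numbers, (4, 5), 9))  # True -- checked the easy way

If P turned out to equal NP, the finding side would collapse to something as quick as the checking side, and with it the locks of the digital age.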