Posts

Showing posts from February, 2014

Hemingway vs. Hemingway

There’s this app called Hemingway that evaluates your composition and grades it on a scale from 1 to 10. Its aim? “Hemingway makes your writing bold and clear.” It shows its results by color-coding the text you entered:
- Yellow for “long, complex sentences”;
- Red to mean the “sentence is so dense and complicated that your readers will get lost trying to follow its meandering, splitting logic”;
- Blue for adverbs: “Get rid of them and pick verbs with force instead”, the app recommends;
- Purple where a simpler word might do, e.g. “use” instead of “utilize”;
- Green to indicate use of the passive voice.

(You could disagree with some of the points in some contexts, but surely, as a rule of thumb, most of them do make sense. And the app is free, so what do you expect?) Keeping in mind that the smartphone generation may not be all that much into long-form writing (texts and tweets may be their preferred mode), the app has

The Path to Amazing

Getting started on any creative activity is so hard. Mostly it is because we feel the pressure to come up with something good, if not great. We tend to believe the myth that great works just happen…whereas the reality is that they evolve over multiple iterations. Which is why Anne Lamott suggests that we feel free to “write a very, very horrible first draft”. Which sounds illogical at first: if you know it’s crappy, why write it at all? The answer is simple: to overcome the inertia; to get some momentum going. In the case of writing, Shawn Blanc says it helps to know one of the myths most of us hold about writing: “I wait to get started because I assume that if I don’t write something magical and clever as I’m typing it for the first time then I certainly won’t be able to improve upon it in the editing and re-writing process.” But of course, that’s not true. You can improve the draft later. Seth Godin puts it very well: “[T]he only path to amazing runs directly through not

Filtering Photos

My wife doesn’t like the fact that I delete many of the pics on my phone very soon after I take them (I delete even more after downloading them to a computer’s bigger screen). And it’s only on this filtered set that I crop, zoom and apply the photo apps’ filters. Why do I purge so many? Because let’s face it: most of the pics are average or worse, and certainly not the kind I would look back on fondly years later. I want to avoid what photojournalist Kenneth Jarecke describes: “Instead of having a body of work to look back on, you’ll have a sad little collection of noisy digital files that were disposable when you made them, instantly forgotten by your followers (after they gave you a thumbs up), and now totally worthless.” Most people, of course, don’t do that. They just post anything and everything they clicked, filter applied, onto Facebook or Instagram. After all, isn’t What-I-had-for-breakfast the most common theme on Instagram? I agree with Ted Nym

Immortality

I am a huge fan of Feynman and agree with so many of his views, and am blown away by how articulate he is and how a scientist can think like an engineer (i.e., be practical)! But these lines from one of his lectures in 1964 surprised me: “It is one of the most remarkable things that in all of the biological sciences there is no clue as to the necessity of death. If you say we want to make perpetual motion, we have discovered enough laws as we studied physics to see that it is either absolutely impossible or else the laws are wrong. But there is nothing in biology yet found that indicates the inevitability of death. This suggests to me that it is not at all inevitable.” I was surprised by Feynman’s ignorance on this topic, so to speak. Let me elaborate. Life rarely deals with individual anything (it is either about a combination of cells to form an organ, or a combination of organs to form an individual, or a combination of individuals to form a group/species). Further, pretty

The Printed Book Lives On

Citing the Association of American Publishers data, Nick Carr pointed out that the explosive growth of e-book sales in the US (doubling or much more every year between 2008 and 2011) seems to have not just slowed down, it seems to have come to a halt (it was an anemic 5% this year). Note that Carr is not saying that e-book sales have stopped; just that the growth in sales between successive years has slowed. In fact, as Carr points out, the share of printed books in the US continues to fall, just not as fast as before. E-books currently account for 25% of total book sales. Carr’s list of suspects for this apparent e-book plateau includes factors like the medium not suiting certain types of books (“like nonfiction and literary fiction”) or certain reading positions (like “lying on the couch at home”); the existence of multipurpose tablets that distract you from reading; and there not being much of a price difference between printed and e-books. And, perhaps, it also ha

Not a Science

During the 2008 financial crisis, central banks all over the Western world pumped money to bail out the very financial firms that were responsible for the crisis. As John Dickerson scathingly described: “Risk is supposed to be about choice and consequence. You take a chance and you win or you lose…companies that helped cripple the financial system were repaid by the government bailout. They took a chance, and lost—but they still won.” Now contrast the above with how the British government responded to the infamous Irish potato famine in 1845: they ascertained the high severity of the issue and then decided to do nothing! Why? It was a time when Adam Smith’s theory ruled Britain with an iron fist. Felix Martin in his book, Money: The Unauthorized Biography: “Adam Smith had proved that it was allowing private self-interest to operate as freely as possible that most efficiently achieves the social good…Within seventy years of its publication, Adam Smith's theory of moneta

Paratext

In a world where the TV screen isn’t the only screen in town, where the tablet, the smartphone and even the good old PC/laptop screen vie for attention, you could expect the TV show industry to do something to hold on to its viewers. So what did they do? Well, they’ve accepted the reality that the “two-screen experience” (or even more screens) is here to stay. TV shows in the West even ask you to “log on to the show's website and participate in real-time chats and interactive razzle-dazzle”, as this article by Thomas Doherty says. The term for this “decorative wraparound material”? Paratext. The longer definition? “The paratext is the satellite debris orbiting and radiating out from the core text: what the post-telecast chatfest Talking Dead is to The Walking Dead, what Madonna-vs.-Lady Gaga mashups are to the original music videos, what Wolverine action figures are to the X-Men franchise—what all the buzzing swarms of trailers, teasers, bloopers, tweets, swag, webisode

Toast to the Villains

The latest season of Sherlock was terrible: the attempt to make Holmes a bit more human fell flat; the plots were non-existent; and, most important of all, there was no James Moriarty. As this article says: “Indeed, a truly great superhero comic is defined by the quality of the villains. Batman has the Joker; Superman has Lex Luthor; Spider-Man has Doc Ock; Iron Man has the Mandarin. In each of those cases (and many more), the best villains have become just as legendary as their corresponding heroes, and are almost always more interesting.” To this list, I would add: Sherlock Holmes has James Moriarty; and Harry Potter has Lord Voldemort. In any case, aren’t they like yin and yang? Or as Moriarty puts it: “We're just alike, you and I, except you're boring. You're on the side of the angels.” Great villains don’t just liven up “banal stories about (heroes) saving the day and getting the girl while simultaneously standing for truth, justice, and the Ame

Game Changers

I played a fair amount of Atari video games as a kid, but I never became a game junkie or addict. And yet I found this Michael Thomsen article on cheating in video games very interesting and thought-provoking. But first, let’s be clear about what does not constitute online game cheating. In many online games, you could play the old-fashioned way (advancing by your skills). Or you could pay the game maker to get extra lives, to increase your farm produce, or to skip levels! (In case you’re wondering, most such games are free and such purchases are how the game makers make money. It’s just a different business model.) But at least those games were designed with such options in mind, and everyone knows that some other players will be employing such techniques. So what about games where cheat codes were never intended to be part of the game? Your instinctive reaction to using such cheat codes might be, to quote Johan Huizinga: “as soon as the rules are transgressed