Saturday, January 25, 2014

The Apple Mac turns 30

A Look Back at 30 Years of the Mac
The Apple Macintosh computer turns 30 on Friday; here's a look back at what made the Mac special and how it evolved over the past three decades.
In the early 1980s, the home computer and business PC revolutions were already in full swing. Apple set the template with the Apple II in 1977, while competitors Radio Shack, Atari, and Commodore followed suit. Meanwhile, in 1981 IBM introduced an artificially crippled, open-architecture, 16-bit machine called the IBM PC, which, when combined with Lotus 1-2-3, took off in popularity in business environments large and small.

It was the Macintosh, though, that set the course for both kinds of computing for the next several decades. While Apple didn't invent the graphical user interface, with the Mac the company brought it to mainstream consumers for the first time. Microsoft and IBM immediately began copying its various idioms and design language—at first with a kind of hilarious ineptitude, and then in earnest beginning with Windows 3.0 in 1990 and OS/2 2.0 in 1992. The rest, of course, is history.

Today, the 30th anniversary of the Apple Mac is upon us. It goes without saying we wish Steve Jobs were still around to celebrate with all of us. So with a nod to him—and Steve Wozniak, who started Apple with Steve Jobs—and to everyone who worked on the original Mac and what followed, let's take a look back and see how we got to where we are today. [...]
What a different world it was back then. I was a Commodore 64 user at the time; the Mac was just too expensive. And IMO, it still is. Sure, their stuff IS nice, but I just won't pay that much for overpriced proprietary hardware. If they ever reverse-engineer their software to run on regular PC hardware, I'd consider it, but I won't hold my breath waiting for that to happen.

(I say reverse-engineer, because the current Mac operating system is built largely on BSD code. BSD is an open-source, Unix-like operating system that runs on standard PC hardware. Apple re-engineered it to work on their proprietary hardware, and could just as easily make it run on ordinary PC hardware again if they wanted to.)

But there is no denying the massive impact Apple had on computer operating systems, especially the graphical user interfaces they used. Apple set the standard for the PC GUI. This article is a real Blast from the Past, looking at how it all came together for the first time.

San Francisco's "Tales of the City" Ends

Author Armistead Maupin ends San Francisco ‘Tales of the City’ saga with ninth volume
In 1974, when Armistead Maupin began writing what became Tales of the City, he thought of it as “an in-joke about the way life worked in San Francisco”. Four decades later, that in-joke has been shared by more than 6 million readers. His stories of interlocking gay and straight lives in the city constitute one of the best-loved of literary sagas. The New York Times described reading them as “like dipping into an inexhaustible bag of M&Ms, with no risk of sugar overload”. Now, though, after four decades, that bag is finally about to be exhausted. The series will conclude with Maupin’s ninth book, The Days of Anna Madrigal, published at the end of this month. [...]
I really enjoyed the PBS series based on the books. It really seemed to capture many of the special particulars, the eccentricities, of life in San Francisco in those days.


The real, post-election Mitt Romney

The New Mitt Romney Documentary Is Fantastic, And It Exposes The Fundamental Flaw In A Lot Of Campaigns
One of Mitt's sons, Josh, was asked by filmmaker Greg Whiteley in the midst of the 2008 primary if he ever thought it wasn't worth the trouble to run.

Josh responded with two different answers — one from his media "training," and one that he said was the truth.

Here's the answer he gave as if he were speaking to the media:

"The opportunity [is] for someone like my dad to come in and run the country. And the challenges we face right now in this country, to have someone with my dad’s experience, his knowledge, and his vision for America, someone that can come in and do this. It’s worth whatever it takes for us to get my dad into office."

Here's the "translation":

"This is so awful. It’s so hard. They always say, why can’t you get someone good to run for president? This is why. This is why you don’t get good people running for president. What better guy is there than my dad? Is he perfect? Absolutely not. He’s made mistakes. He’s done all sorts of things wrong. But for goodness sakes, here’s a brilliant guy who’s had experience turning things around, which is what we need in this country. I mean, it’s like, this is the guy for the moment. And we’re in this, and you just get beat up constantly."


“Mitt,” Al Gore, and Our Identification With Presidential Losers
[...] Many reviews of “Mitt” have noted its humanizing effect on Romney: he is revealed to be thoughtful and gracious and, in scenes with his family, funny and self-aware. There are even murmurings that such a portrait, had it been released before the election, would have helped him to shed his reputation as an ambitious automaton and to forge a closer connection to voters. Maybe he would have won. But, in the heat of a campaign, the documentary would have been greeted differently, as a purely political object—mined for ready clues to his political positions, spun predictably by supporters and detractors. What did the fact that he listened to “This American Life” or quoted “O Brother Where Art Thou?” or attempted to iron his clothes while wearing them say about his ability to be the President? Surely his handlers wouldn’t have wanted anyone hearing him call himself “the flipping Mormon” or noting, rather bitterly, that he may have been a “flawed candidate.” But there is not much utility in a retrospective gaffe; seen now, the documentary is more intriguing for its general tone, which is one of pathos and quiet regret. [...]

Meanwhile, the RNC's struggle to expand and find unity within itself continues:

RNC showcased update, while losing image remains

The road ahead is looking rather long.

Saturday, January 18, 2014

Train Your Brain, with Lasting Results

Brain training helped older adults stay sharp for years -study
CHICAGO, Jan 13 (Reuters) - A brief course of brain exercises helped older adults hold on to improvements in reasoning skills and processing speed for 10 years after the course ended, according to results from the largest study ever done on cognitive training.

People in the study had an average age of 74 when they started the training, which involved 10 to 12 sessions lasting 60 to 75 minutes each. After five years, researchers found, those with the training performed better than their untrained counterparts in all three measures.

Although gains in memory seen at the study's five-year mark appeared to drop off over the next five years, gains in reasoning ability and processing speed persisted 10 years after the training.

"What we found was pretty astounding. Ten years after the training, there was evidence the effects were durable for the reasoning and the speed training," said George Rebok, an expert on aging and a professor at Johns Hopkins University in Baltimore, who led the study.

Participants in all three training groups also reported that they had an easier time with daily activities such as managing their medications, cooking meals or handling their finances than did participants who did not get the training. But standard tests of these activities showed no differences between the groups.

"The speed-of-processing results are very encouraging," said study co-author Jonathan King, program director for cognitive aging in the Division of Behavioral and Social Research at the National Institute on Aging (NIA), part of the National Institutes of Health, which helped fund the research.

King said the self-reported improvements in daily function were interesting, but added, "We do not yet know whether they would truly allow older people to live independently longer."

The training course was designed to bolster specific cognitive abilities that begin to slip as people age. It does not aim to prevent dementia caused by underlying disease such as Alzheimer's.

At the start of the study, all 2,832 participants were cognitively normal. The study included four groups: three training groups plus a control group of volunteers who came in for regular testing to see how they were faring with age.

People were trained in small groups over a period of several weeks and then were tested immediately after the training and again one, two, three, five and 10 years later.

About 60 percent of the volunteers who underwent training also got booster training sessions, which enhanced the initial benefits.

At the end of the trial, all groups showed declines compared with their initial baseline tests in memory, reasoning and processing speed, but those who got training in reasoning and processing speed experienced less decline.

The programs, developed by the researchers, were focused largely on teaching strategies to improve cognitive performance. For example, the memory training taught people how to remember word lists, sequences and main ideas, while the reasoning training focused on things like recognizing number patterns.

In the processing speed training, people were asked to focus on the main object on a computer screen while also trying to quickly recognize and identify objects on the periphery of the screen. Such training can help older drivers with things like recognizing road signs while driving.

A version of the speed training program developed for this trial is now commercially available through the brain fitness company Posit Science, but the researchers are working on making other types of training available as well.

Rebok's team just got a grant from the National Institute on Aging to make a computerized version of the memory test, with the hope that repeated training can improve the results.

The study was not designed to explain why cognitive training can have such a lasting effect. Rebok said it may be that people take the strategies they learn and practice them over time. As they age, trained individuals can rely on these strategies to compensate for their declines. [...]
I find it interesting that 10 to 12 hours of training could have such long-lasting effects. The participants must have learned techniques that they incorporated into their daily lives, which made a difference in the long run: an education in how to use their brains, or how to use their thinking and reasoning faculties more effectively. Why not teach these things in schools, to people who are young? Such habits might benefit them throughout their entire lives.


Is "Human Trafficking" Unimportant to India?

From the Times of India:

Wayne’s world: Was expelled US official a bleeding heart or an ugly American?
WASHINGTON: The US official who was expelled in a tit-for-tat diplomatic battle over Devyani Khobragade was nearing the end of his posting in India, scheduled to leave New Delhi in February. But in their three years in India, Wayne May, who headed the US embassy's security team in New Delhi, and his wife Alicia Muller May, who worked as the embassy's community liaison officer, revealed conflicting impulses and contradictory outlook towards the people and country they served in.

On the one hand, it was evidently their bleeding heart concern for housekeeper Sangeeta Richard, whose in-laws worked with them and a succession of US embassy officials, that led them to "rescue" the nanny's husband and children from the strong-arm tactics of the Indian judicial and police system that diplomat Devyani Khobragade unleashed on them after Sangeeta fell out with her. On the other hand, their facetious comments about a stereotypical India abounding in chaos and filth, which some might see as offensive, shows them as the archetypal "ugly Americans".

They laid out their opinions and views quite guilelessly on social media through photographs and comments that were quickly seized on and distributed by bloggers and trolls ever sensitive to any perceived insult of India. Although the comments are often flippant, the kind many people make on social media without fear of consequence, they sound extremely offensive now given the fraught context of the diplomatic spat. Their profiles, pictures and comments were removed and their social media presence sanitized soon after they were discovered, but not before the online warriors had saved and uploaded them on other social media sites, portraying them as "racist American diplomats". [...]
You can read the rest of the article to see the offensive Facebook posts. They might have been insensitive, in the strictest sense, but they were also truthful. I think many Americans do find India to be a place of contradictions.

I found it interesting how the article kinda glosses over "the strong-arm tactics of the Indian judicial and police system", and the way it puts the word "rescue" in quotes, and then proceeds to hype the Facebook comments. But honestly, which is more serious: comments on a Facebook page, or human trafficking?

All the articles I've read in the Indian press seem to completely ignore the accusations against Devyani Khobragade. Are they really so unimportant and irrelevant?

It's not like she and her family are squeaky clean. It seems there is some scandal in India regarding politics and real estate.

I don't know if the accusations made against her in New York are true; a trial would have revealed that, but she didn't stick around to defend herself. Was she mistreated? That would have been explored and exposed in a trial too, but she left. Was it because she didn't want the truth to be revealed? Perhaps she would have been exonerated on some charges, but not others, and chose not to risk that?

I can't help but wonder if this is really about something going on between the Indian and US governments, some sort of power play, and whether this incident is just a symptom of something larger that we're not hearing about. Is there any foundation to the charges against Khobragade? Why, or why not? Real journalists might ask questions like these, but there don't seem to be any anymore, anywhere. Instead, we get the tit-for-tat stuff, because it sells newspapers, I guess. Their newspapers seem just as rubbishy as ours: more hype than content.

Update 01/15/14: Also see:

Timeline: The case of Devyani Khobragade and Sangeeta Richard
A timeline of the facts?

Claim Against Indian Diplomat Has Echoes of Previous Cases
NYC unionized workers take the side of the maid. At the end of the article, local Indian merchants in NYC are quoted, saying the maid should have been grateful, because she would have been treated much worse in India.

Since she claims she was forced to work from 6 am to 11 pm every day, without being paid, with only two hours off on Sunday to go to church, I guess that "worse in India" must be really, really bad.

Devyani Khobragade incident
Wikipedia provides its version of the facts, which seems more or less what I've read elsewhere.

The Grace Commission: Good Advice Ignored

I've often heard the Grace Commission mentioned in various articles, so decided to look it up. From Wikipedia:

The Grace Commission
The Private Sector Survey on Cost Control (PSSCC), commonly referred to as The Grace Commission, was an investigation requested by United States President Ronald Reagan, in 1982. The focus of it was waste and inefficiency in the US Federal government. Its head, businessman J. Peter Grace,[1] asked the members of that commission to "be bold" and "work like tireless bloodhounds. Don't leave any stone unturned in your search to root out inefficiency."[2]

The report
The Grace Commission Report[3] was presented to Congress in January 1984. The report claimed that if its recommendations were followed, $424 billion could be saved in three years, rising to $1.9 trillion per year by the year 2000. It estimated that the national debt, without these reforms, would rise to $13 trillion by the year 2000, while with the reforms they projected it would rise to only $2.5 trillion.[4] Congress ignored the commission's report. The debt reached $5.8 trillion in the year 2000.[5][6] The national debt reached $13 trillion after the subprime mortgage-collateralized debt obligation crisis in 2008.

The report said that one-third of all income taxes are consumed by waste and inefficiency in the federal government, and another one-third escapes collection owing to the underground economy. “With two thirds of everyone’s personal income taxes wasted or not collected, 100 percent of what is collected is absorbed solely by interest on the federal debt and by federal government contributions to transfer payments. In other words, all individual income tax revenues are gone before one nickel is spent on the services [that] taxpayers expect from their government."[4]
Congress was warned. They had the chance to do something about it, and did nothing. We The People let them do it. Now we are living with the consequences.

Mr. Grace, a Democrat and a businessman, was an interesting fellow:

J. Peter Grace
[...] In the Kennedy administration, J. Peter Grace was head of the Commerce Department Committee on the Alliance for Progress.[5] President Reagan, in announcing the selection of J. Peter Grace to lead The Grace Commission on waste and inefficiency in the Federal government, said:

We have a problem that's been 40 years in the making, and we have to find ways to solve it. And I didn't want to ruin your appetites, so I waited till now to tell you this, but during the hour we're together here eating and talking, the Government has spent $83 million. And by the way, that includes the price of your lunch. [Laughter] Milton Friedman is right. There really is no such thing as a free lunch. The interest on our debt for the last hour was about $10 million of that.

In selecting your Committee, we didn't care whether you were Democrats or Republicans. Starting with Peter Grace, we just wanted to get the very best people we could find, and I think we were successful.

I'll repeat to you today what I said a week ago when I announced Peter's appointment: Be bold. We want your team to work like tireless bloodhounds. Don't leave any stone unturned in your search to root out inefficiency.[6]

Mr. Grace, a Democrat, was asked what he would say to the campaign theme of Walter Mondale, the 1984 Democratic Presidential candidate, that higher taxes would be required to ease the deficit regardless of who wins the November election.

"I'd tell him he's nuts," Grace said. "He's wrong. He's wrong."[7] [...]

Monday, January 13, 2014

Lifelong Learning, and Resiliency

Two topics I've been interested in, and two articles about them on one of my favorite websites:

How and Why to Become a Lifelong Learner
For the first twenty-two years or so of our lives, our main “job” is learning. The bulk of our time is spent in classrooms acquiring new knowledge. And then, once we graduate, we feel like the education phase of our lives is done and now it’s time to go out into the world. Have you ever thought about how odd that idea is? That only a quarter of our lives should be devoted to learning, and then we should simply rest on our laurels for the remaining three-quarters of it?

It’s an erroneous idea – but one many have absorbed, at least subconsciously. But school need not be your exclusive provider of learning. Just because you’ve finished your formal education, doesn’t mean that your education is over!

Many, perhaps most, of history’s greatest men were autodidacts – those who devote themselves to self-education, either in addition to or as a substitute to formal schooling. [...]
It goes on to give examples, and more. Lots of good links, I thoroughly enjoyed it. One of the links was this:

Building Your Resiliency: Part V – Recognizing and Utilizing Your Signature Strengths
This is the fifth part in a series designed to help you boost your resiliency. For the previous entries, see Part I, Part II, Part III, and Part IV.

When we first introduced the topic of resiliency, we discussed how it is both a reactive and an active quality, a skill that helps you bounce back and reach out.

Today’s discussion will center on the active aspect of resiliency and the path to gaining the confidence to take risks and embrace change.

Anchoring Your Resiliency in Your Authentic Self

When your self-esteem and sense of self-worth is tied to other people, your job, or any other external factors, your confidence is subject to every wind of change and lacks real stability. Any time these external factors change, your happiness and confidence go with it. Your emotional fortitude goes up and down like a roller coaster.

Tying your self-concept to external factors also keeps you from embracing adventure and approaching the world like a courageous explorer. If you base your self-concept on external things, any changes in those things will throw you for a loop, create anxiety, and compel you to cling as tightly as you can to the status quo. You become desperate to keep your life just the way it is and can’t handle change. You avoid traveling, moving, changing jobs, and getting into relationships because these steps alter the environment on which you’ve based your self-concept, leaving you feeling lost and out of control.

The key to active resiliency is to build your self-concept not on a constructed self, but on an authentic self, not on external things, but on the inner, personal strengths that make you unique as a man. Your unique strengths are your special tools that will allow you to build a happy and fulfilling life. Understanding what tools you possess can give you the confidence that you’ll be able to face any challenge that comes your way. While we can’t predict the future, we can have confidence in our ability to deal with whatever happens. [...]
Again, it's full of interesting links. And this article is part of a series, so I can look forward to reading them all.

History of alcohol in Colonial America

It was a topic that came up at dinner, so I looked it up and found this on Wikipedia:

History of alcoholic beverages
[...] Colonial America
[Image caption: Interior view of the Toll Gate Saloon in Black Hawk, Colorado (1897).]

Alcoholic beverages played an important role in Colonial America from the very beginning. The Mayflower brought more beer than water as it departed for the New World. While this may seem strange viewed from the modern context, it should be understood that drinking wine and beer at that time was safer than water - which was usually taken from sources used to dispose of sewage and garbage. Their experience showed them that it was safer to drink alcohol than the typically polluted water in Europe. Alcohol was also an effective analgesic, provided energy necessary for hard work, and generally enhanced the quality of life.

For hundreds of years their English ancestors had consumed beer and ale. Both in England and in the New World, people of both sexes and all ages typically drank beer with their meals. Because importing a continuing supply of beer was expensive, the early settlers brewed their own. However, it was difficult to make the beer they were accustomed to because wild yeasts caused problems in fermentation and resulted in a bitter, unappetizing brew. Although wild hops grew in New England, hop seeds were ordered from England in order to cultivate an adequate supply for traditional beer. In the meantime, the colonists improvised a beer made from red and black spruce twigs boiled in water, as well as a ginger beer.
[Image caption: A Depression-era bar in Melrose, Louisiana.]

Beer was designated X, XX, or XXX according to its alcohol content. The colonists also learned to make a wide variety of wine from fruits. They additionally made wine from such products as flowers, herbs, and even oak leaves. Early on, French vine-growers were brought to the New World to teach settlers how to cultivate grapes.
[Image caption: J.W. Swarts Saloon in Charleston, Arizona in 1885.]

Colonists adhered to the traditional belief that distilled spirits were aqua vitae, or water of life. However, rum was not commonly available until after 1650, when it was imported from the Caribbean. The cost of rum dropped after the colonists began importing molasses and cane sugar directly and distilled their own. By 1657, a rum distillery was operating in Boston. It was highly successful and within a generation the production of rum became colonial New England's largest and most prosperous industry.

Almost every important town from Massachusetts to the Carolinas had a rum distillery to meet the local demand, which had increased dramatically. Rum was often enjoyed in mixed drinks, including flip. This was a popular winter beverage made of rum and beer sweetened with sugar and warmed by plunging a red-hot fireplace poker into the serving mug. Alcohol was viewed positively while its abuse was condemned. Increase Mather (d. 1723) expressed the common view in a sermon against drunkenness: "Drink is in itself a good creature of God, and to be received with thankfulness, but the abuse of drink is from Satan; the wine is from God, but the drunkard is from the Devil."

In the early 19th century, Americans had inherited a hearty drinking tradition. Many types of alcohol were consumed. One reason for this heavy drinking was attributed to an overabundance of corn on the western frontier, which encouraged the widespread production of cheap whiskey. It was at this time that alcohol became an important part of the American diet. In the 1820s, Americans drank seven gallons of alcohol per person annually.[20][21]

During the 19th century, Americans drank alcohol in two distinctive ways. One way was to drink small amounts daily and regularly, usually at home or alone. The other way consisted of communal binges. Groups of people would gather in a public place for elections, court sessions, militia musters, holiday celebrations, or neighborly festivities. Participants would typically drink until they became intoxicated. [...]
Follow the link for the many embedded links, photos and footnotes.

"Forbes" Reviews Detective Show

I saw this on the Forbes website:

How HBO's 'True Detective' Will Change The Way You Watch Television
HBO’s new Sunday night drama True Detective is really, really good. It’s also potentially revolutionary.

The moody crime drama starring Woody Harrelson and Matthew McConaughey examines a grisly Louisiana murder through the eyes of the detectives who handled the case. Some reviewers are comparing it to The Wire–which makes most short lists for the Greatest Series Ever–although a more apt comparison may be an earlier David Simon work, the more stylized Homicide: Life on The Street. In any case, this show is well worth watching (and worth discussing, as we’ll do in weeks to come.) [...]
It sounds like it could be good. But I don't get HBO. Maybe it will come out on Netflix eventually?

Sunday, January 05, 2014

The OLPC tablet, with Android

I've blogged previously about The OLPC Project. This tablet looks like it's made by them, but it isn't; it's made by Vivitar, though it appears OLPC authorized or commissioned it:
XO 7-inch Kids Tablet XO-780
[...] Tablets are always a family device shared by children, parents and even grandparents. The XO Tablet is the perfect solution: a full-fledged Android tablet with Google Play for older users and the educational tablet with its own app store for children. Press one icon and the tablet changes to match the user. Parents set controls to limit child access to the Android version.


The XO Tablet is a continuation of OLPC's mission to put education and learning--through connected devices--into the hands of children around the world. OLPC's proceeds from XO Tablet purchases will be used to further develop the XO Learning software, to enable region-specific enhancements and customizations and to get connected technology to a larger population of children. OLPC, as a non profit organization, donates software, tablets, and teacher training to schools with underprivileged children in the U.S. and around the world. [...]
The reviews are mostly positive. And interestingly enough, adults seem to find it equally useful. Follow the link; there's lots more information. I've been looking at tablets, and for the price, this one seems pretty good.

When was Buddha Born?

Which is to say, what century:

How two archaeologists’ hunch led to stunning claim about Buddha’s birth date
The two archaeologists had a hunch that the Buddha’s birthplace in southern Nepal held secrets that could transform how the world understood the emergence and spread of Buddhism.

Their pursuit would eventually see them excavate the sacred site of Lumbini as monks prayed nearby, leading to the stunning claim that the Buddha was born in the sixth century BC, two centuries earlier than thought.


“The Eureka moment came in 2011, when we came across a brick temple located below the existing Asokan temple, and below that a sort of void.

“It became clear then that there was much more to this excavation.”

Over the next two years, archaeologists, geophysicists and hired workmen from Nepal and Britain worked on the site, digging in the presence of meditating monks and nuns.

“It was a very moving, very special experience to dig for traces while pilgrims prayed and paid homage,” Acharya said.

They dug for a few weeks each year and sent the samples to laboratories for analysis.

Radiocarbon and optically stimulated luminescence techniques were used to date fragments of charcoal and grains of sand found at the site.

The archaeologists also found holes, apparently meant to secure posts, in the open void below the brick temple.

“The intact holes suggested that whoever had built the brick temple had taken care not to damage the ancient structure below, suggesting the site was always considered holy,” Coningham said.

Lab tests confirmed the existence of roots within the void below the brick structure, suggesting it may have been a shrine where a tree once grew, possibly the hardwood sal tree under which many believe the Buddha was born.

The discovery, revealed in November, sparked huge excitement, but some historians have urged caution, saying the ancient tree shrine could have been built by pre-Buddhist believers.

“The worship of trees, often at simple altars, was a ubiquitous feature of ancient Indian religions,” Julia Shaw, a lecturer in South Asian archaeology at University College London told National Geographic’s online edition.

“It is also possible that what is being described represents an older tree shrine quite disconnected from the worship of the historical Buddha,” Shaw added.

According to Coningham, however, if the Buddhists had appropriated the tree shrine from non-Buddhists, the site would not have survived relatively unscathed.

“Also, the inscriptions at Bodhgaya (where the Buddha achieved enlightenment) reveal a thriving culture of tree worship, which suggests continuity,” he added. [...]
The article goes on to say that what they are finding would seem to match what the local people believe: that Buddha was born around 623 BC.