
Tuesday, January 14, 2020

Windows 7 support ends. So where to now?

Microsoft suggests upgrading to Windows 10. That would be fine... if it worked. They offered free Windows 10 upgrades. I tried one, and it was disastrous. It seemed to work well at first, but as time went on, updates would cause different parts or functions of the computer (like SOUND) to stop working. It turns out that unless your computer hardware -all of it- has been "Windows 10 certified", Microsoft does not guarantee that it will work on YOUR computer. I wish I had known that before I installed it. By the time I discovered this, it was too late to roll back from Windows 10 to Windows 7.

So if you want to "upgrade" to Windows 10, you are probably better off getting a computer with it already installed and certified for that hardware. Then the Windows 10 fun can begin. It has some good features. Yet some things never change.


But... what should you then DO with your old Windows 7 machine? You can keep using it for a while longer of course, but as time goes on, without security updates, it will become riskier and riskier to use.

Personally, I found a solution with my aborted Windows 10 computer, the one that couldn't be rolled back to Windows 7, and I'm now using it on all my Windows 7 machines. The solution is a Linux operating system called Linux Mint. It's a complete, open-source operating system that you can download and install free of charge.


There are several versions you can choose from. I prefer the Linux Mint Debian Edition (LMDE), because it's a "rolling" distribution; you only have to install it once, then it updates itself continuously after that. Other versions use Ubuntu as a base, and major upgrades require a complete reinstall every three to five years.

It's probably the easiest Linux system for a novice to download and install, and it's easy to learn and use too. A perfect way to extend the life and usefulness of older computers that cannot be successfully upgraded to Windows 10. Highly recommended.
   

Monday, January 18, 2016

The Push to upgrade to Windows 10

It continues:

Microsoft Makes Windows 7 And Windows 8 Support Worse
Think your copy of Windows 7 is supported until 2020? Think your copy of Windows 8 is supported until 2023? You might want to think again because Microsoft has just announced radical changes to how it will treat users of both operating systems…

Talking on its Windows Blog, Microsoft has announced it will now stop support for installations of Windows 7 or Windows 8 if they are on new or upgraded computers running the latest chips from Intel, AMD or Qualcomm. Specifically these are listed as ‘Kaby Lake’ (Intel), ‘Bristol Ridge’ (AMD) and Qualcomm’s ‘8996’ (the base for the Snapdragon 820). Between them these chips will dominate sales of all new desktops, laptops, hybrids and tablets in 2016.

In fact Microsoft is going even further than this by also refusing to support Windows 7 and Windows 8 on Intel’s current generation ‘Skylake’ processors, with the exception of a “list of specific new Skylake devices”. This list includes the Dell Latitude 12 and XPS 13; HP EliteBook Folio and G3 and Lenovo ThinkPad T460s and X1 Carbon. Even then support on those devices will only last 18 months ending on 17 July, 2017.

Yes, you read this right: Microsoft is breaking from 31 years of Windows history by refusing to honour its promised Windows lifecycles unless users stick to old hardware. Upgrade your existing Windows 7 or Windows 8 computer to these chipsets or buy new hardware and install Windows 7 or Windows 8 on it and the official Windows Lifecycle dates don’t mean a thing.

All of which begs the question…

Why Is Microsoft Doing This? [...]
Read the whole thing for the embedded links, and the links at the end to related articles. I've posted previously about Microsoft's plans to
force the Windows 10 upgrade. This, too, is pressure in that direction.

I've been using Windows 10 on one of my machines. It's not absolutely horrible, and it even has some nice features. It has thus far proven to be about 95% stable. Unfortunately, the unstable 5% can kick in when I'm trying to get serious work done. I find that kind of unreliability intolerable for running a business.

I need a RELIABLE computer platform to run business software like QuickBooks. If Windows 10 does not improve its stability, I will most likely migrate to Apple, because it's a mainstream OS that can provide that stability. At least I hope it can. Can anyone tell me differently? No OS is without some problems, but a certain degree of stability is necessary for business. I use a computer to get work done, not so I can work on the computer to try to get it to work.
     

Tuesday, January 05, 2016

"Creepy" Robot Receptionist?

Yeah, kinda. Sorta. In a way. Or not. What do you think?:
Does this “humanlike” robot receptionist make you feel welcome or creeped out?
From a distance, Nadine looks like a very normal middle-aged woman, with a sensible haircut and dress style, and who’s probably all caught up on Downton Abbey. But then you hear Nadine talk and move, and you notice something’s a bit off. Nadine is actually the construct of Nadia Thalmann, the director of the Institute for Media Innovation at Nanyang Technological University in Singapore. She’s a robot that’s meant to serve as a receptionist for the university.
Thalmann modeled the robot after herself, and said that, in the future, robots like Nadine will be commonplace, acting like physical manifestations of digital assistants like Apple’s Siri or Microsoft’s Cortana. “This is somewhat like a real companion that is always with you and conscious of what is happening,” Thalmann said in a release.

Nadine can hold a conversation with real humans, and will remember someone’s face the next time she sees him. She can even remember what she spoke about with the person the last time they met. NTU said in its release that Nadine’s mood will depend on the conversations she’s having with others, much like a human’s mood can change. There’s no word on what she’d do in a bad mood, though—hopefully she won’t be able to close pod bay doors, or commit murder. Perhaps when the robot uprising happens, we won’t even see it coming, as they’ll all look just like us. [...]
The article goes on to talk about how the evolution of these robots is likely to continue, as they get better and even become commonplace. Read the whole thing for photos, video, and many embedded links. Do watch the video, it's short. I have to admit it's the most life-like robot I've ever seen.

I said it was "kinda" creepy because it looks so life-like, yet is not alive, and I'm not used to that: talking to "life-like" things. But I suppose if it becomes commonplace, one would get used to it as normal. More than "kinda creepy", though, it's... pretty darn kewl! Commander Data, here we come...

Here is another link to a similar robot by another scientist:

The highest-paid woman in America is working on robot clones and pigs with human DNA
[...] Rothblatt also explained how she hired a team of robotic scientists to create a robot that was a “mind clone” of her wife, Bina Aspen.

Starting with a “mindfile”—a digital database of a person’s mannerisms, personality, recollections, feelings, beliefs, attitudes, and values gleaned from social media, email, videos, and other sources—Rothblatt’s team created a robot that can converse, write Tweets, and even express human emotions such as jealousy and pain in ways that mimic the person she was modeled after.

When Bina’s mortal self dies, Rothblatt said the robot version of her wife will live on, making it possible for “our identity to begin to transcend our bodies.”

It sounds like science fiction until you see photos of the robot, see her tweet, and hear snippets from her conversations that made audience members gasp and chuckle nervously as they realized Rothblatt was talking about more than just an idea. [...]
Read the whole thing for embedded links and more. And get ready for the Brave New World. It's closer than you think.
     

Saturday, December 12, 2015

Elon Musk, on OpenAI: “if you’re going to summon anything, make sure it’s good.”

I agree. Will these guys lead the way?

Elon Musk and Other Tech Titans Create Company to Develop Artificial Intelligence
[...] The group’s backers have committed “significant” amounts of money to funding the project, Musk said in an interview. “Think of it as at least a billion.”

In recent years the field of artificial intelligence has shifted from being an obscure, dead-end backwater of computer science to one of the defining technologies of the time. Faster computers, the availability of large data sets, and corporate sponsorship have developed the technology to a point where it powers Google’s web search systems, helps Facebook Inc. understand pictures, lets Tesla’s cars drive themselves autonomously on highways, and allowed IBM to beat expert humans at the game show “Jeopardy!”

That development has caused as much trepidation as it has optimism. Musk, in autumn 2014, described the development of AI as being like “summoning the demon.” With OpenAI, Musk said the idea is: “if you’re going to summon anything, make sure it’s good.”

Brighter Future

“The goal of OpenAI is really somewhat straightforward, it’s what set of actions can we take that increase the probability of the future being better,” Musk said. “We certainly don’t want to have any negative surprises on this front.” [...]
I did a post about that comment of his a while back:

The evolution of AI (Artificial Intelligence)

Nice to see that those who were making the warnings are also actively working to steer the development in positive directions, and trying to avoid unforeseen consequences.

I still think real AI is a long way off. But it isn't too soon to start looking ahead, to anticipate and remedy problems before they even occur.
     

Wednesday, December 02, 2015

Oh no, what have I done?

In a weak moment, whilst perusing the Black Friday offerings on Amazon.com, I ordered one:



Amazon Echo
Amazon Echo is designed around your voice. It's hands-free and always on. With seven microphones and beam-forming technology, Echo can hear you from across the room—even while music is playing. Echo is also an expertly tuned speaker that can fill any room with immersive sound.

Echo connects to Alexa, a cloud-based voice service, to provide information, answer questions, play music, read the news, check sports scores or the weather, and more—instantly. All you have to do is ask. Echo begins working as soon as it detects the wake word. You can pick Alexa or Amazon as your wake word. [...]
The features listed with the photo are only a few of the key features. Follow the link for more info, embedded videos, reviews, FAQ and more.
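
The "wake word" mechanic is simpler than it sounds: the device listens continuously, but only acts on whatever follows the trigger word. Here's a toy sketch of the idea in Python, with text standing in for the audio stream and a small dictionary standing in for the cloud service; none of this is Amazon's actual code.

    # Toy wake-word loop. Text stands in for the audio stream, and the
    # "skills" dict stands in for Alexa's cloud service. Purely illustrative.
    WAKE_WORD = "alexa"

    def handle(command: str) -> str:
        skills = {
            "play some jazz": "Playing jazz.",
            "what's the weather": "Sunny, 72 degrees.",
        }
        return skills.get(command, "Sorry, I don't know that one.")

    for utterance in ["turn on the lights",       # ignored: no wake word
                      "alexa play some jazz",
                      "alexa what's the weather"]:
        first, _, rest = utterance.partition(" ")
        if first == WAKE_WORD:
            print(handle(rest))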

It, "Alexa", arrives tomorrow. I wonder if it will be anything like HAL from the movie 2001: A Space Odyssey? That would be kinda cool, I guess. As long as she isn't the Beta version that murders you while you sleep.

UPDATE 12-08-15: So far, so good. It does everything they said it would. My only complaint: it can't attach to external speakers (but I knew that before I bought it). It was very easy to set up, and it's very easy to use. The voice recognition is really excellent. I can play radio stations from all over the world, and when I want info about a song or piece of music, I can ask Alexa and she will tell me.

There are more features available if I sign up for Amazon Prime ($100 per year, which works out to about $8.33 a month). I'm thinking about it.
     

Tuesday, November 03, 2015

Is Windows 10 the new software "Borg"?

Borg, as in "resistance is futile":

Microsoft Makes Windows 10 Upgrades Automatic For Windows 7 And Windows 8
[...] In September Microsoft admitted it is downloading Windows 10 on every Windows 7 and Windows 8 computer. Then in October it claimed an ‘accident’ saw these downloads begin installing without user permission. Well this accident now looks to have been a secret test run because Microsoft has confirmed mass upgrades to Windows 10 from all Windows 7 and Windows 8 computers are about to begin…

In a post to the official Windows blog, Windows and Devices Group executive vice president Terry Myerson announced this will be a two step process:

Step One

Beginning now, Windows 10 has been reclassified as an “Optional” update in Windows Update for Windows 7 and Windows 8 computers. This means users who have set their version of Windows to accept all updates will find the Windows 10 installation process will begin automatically and they will need to actively cancel it.

[...]

Step Two

But in “early” 2016 things will become more aggressive and Microsoft will again reclassify Windows 10 as a “Recommended” update. Given the default setting on Windows 7 and Windows 8 is for all Recommended updates to install automatically this means the vast majority of users will find the Windows 10 install process starts up on their machines.

“Depending upon your Windows Update settings, this may cause the upgrade process to automatically initiate on your device,” admits Myerson.

[...]

For Most, Resistance Is Now Futile

While tech savvy users will find workarounds and hacks, quite frankly avoiding the upgrade process is going to become far too much effort for the average consumer.

Is Windows 10 worth upgrading? From the perspective of most mainstream consumers, I’d say yes. It’s slicker than Windows 7 and more intuitive than Windows 8. But it is also incredibly invasive and controlling, taking an iron grip on what it installs to your PC and tracking everything you do – something options let you minimise, but not stop entirely.

As such my personal objection to Microsoft’s behaviour is not that Windows 10 doesn’t represent a potentially valuable upgrade, it is that the company has forgotten the fundamental right of customers to choose. And dressing ‘choice’ up as ‘you can just keep saying No’ is a facade everyone should see through…
I had blocked it in my updates, but it keeps unblocking itself and adding itself back. This is really pushy, and I resent it.
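
For what it's worth, Microsoft has documented registry values (in KB3080351) that are supposed to suppress the upgrade offer. Here is a sketch that sets them using Python's winreg module; I'm assuming the values are as published in that KB, and given how the update keeps re-enabling itself, I can't promise they will stick. Run from an elevated prompt, at your own risk.

    # Sets the Windows 10 upgrade-blocking values Microsoft documented in
    # KB3080351 (assumption: values as published there). Requires an
    # elevated (Administrator) prompt on Windows 7 / 8.1.
    import winreg

    POLICIES = [
        (r"SOFTWARE\Policies\Microsoft\Windows\Gwx", "DisableGwx", 1),
        (r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate", "DisableOSUpgrade", 1),
    ]

    for subkey, name, value in POLICIES:
        key = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, subkey)
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)
        winreg.CloseKey(key)
        print(rf"set HKLM\{subkey}\{name} = {value}")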

It just isn't right, because in the end, you have to ask "Whose computer is this, mine or Microsoft's?" I bought it with Windows 7, because that is what I wanted. Offering a free upgrade path to 10 is fine, but I want the freedom to choose it. When I want. If I want. When I decide that I'm ready for it.

Do I actually have to seriously consider moving to a Mac, as my only option? Or moving to Linux Mint on my Windows 7 computer, before it "turns"?
     

Sunday, November 01, 2015

Writing computer code: not for everyone?

Not only not for everyone, but not for most people:

Coding Academies Are Nonsense
[...] I see coding shrinking as a widespread profession. Not because software is going away, but because the way we build software will fundamentally change. Technology for software creation without code is already edging toward mainstream use. Visual content creation tools such as Scratch, DWNLD and Telerik will continue to improve until all functionality required to build apps is available to consumers — without having to write a line of code.

Who needs to code when you can use visual building blocks or even plain English to describe intent? Advances in natural-language processing and conceptual modeling will remove the need for traditional coding from app development. Software development tools will soon understand what you mean versus what you say. Even small advances in disambiguating intent will pay huge dividends. The seeds are already planted, from the OpenCog project to NLTK natural-language processing to MIT’s proof that you can order around a computer in your human language instead of code.

Academies had better gather those revenues while they can, because ultimately they are the product of short-term thinking. Coding skills will continue to be in high demand until technology for software creation without code disrupts the entire party, crowding out programming as a viable profession. [...]
Kinda what I suspected. The technology is changing quickly, and what's valid today is obsolete tomorrow. I think eventually there will be software that can create code. There were also some interesting comments about people who try to learn computer coding, and why they give it up. If you need more convincing, read the whole thing for further arguments, embedded links and more.
     

Sunday, February 08, 2015

What do Stephen Hawking, Elon Musk and Bill Gates all have in common?

They are concerned about the dangers posed by artificial intelligence:

Stephen Hawking warns artificial intelligence could end mankind
[...] He told the BBC: "The development of full artificial intelligence could spell the end of the human race."

His warning came in response to a question about a revamp of the technology he uses to communicate, which involves a basic form of AI.

But others are less gloomy about AI's prospects.

The theoretical physicist, who has the motor neurone disease amyotrophic lateral sclerosis (ALS), is using a new system developed by Intel to speak.

Machine learning experts from the British company Swiftkey were also involved in its creation. Their technology, already employed as a smartphone keyboard app, learns how the professor thinks and suggests the words he might want to use next.

Prof Hawking says the primitive forms of artificial intelligence developed so far have already proved very useful, but he fears the consequences of creating something that can match or surpass humans.

"It would take off on its own, and re-design itself at an ever increasing rate," he said.

"Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded." [...]

Elon Musk Thinks Sci-Fi Nightmare Scenarios About Artificial Intelligence Could Really Happen
[...] Musk, who called for some regulatory oversight of AI to ensure "we don't do something very foolish," warned of the dangers.

"If I were to guess what our biggest existential threat is, it’s probably that. So we need to be very careful with the artificial intelligence," he said. "With artificial intelligence we are summoning the demon."

Artificial intelligence (AI) is an area of research with the goal of creating intelligent machines which can reason, problem-solve, and think like, or better than, human beings can. While many researchers wish to ensure AI has a positive impact, a nightmare scenario has played out often in science fiction books and movies — from 2001 to Terminator to Blade Runner — where intelligent computers or machines end up turning on their human creators.

"In all those stories where there’s the guy with the pentagram and the holy water, it’s like yeah he’s sure he can control the demon. Didn’t work out," Musk said. [...]

Bill Gates: Elon Musk Is Right, We Should All Be Scared Of Artificial Intelligence Wiping Out Humanity
Like Elon Musk and Stephen Hawking, Bill Gates thinks we should be concerned about the future of artificial intelligence.

In his most recent Ask Me Anything thread on Reddit, Gates was asked whether or not we should be threatened by machine super intelligence.

Although Gates doesn't think it will bring trouble in the near future, that could all change in a few decades. Here's Gates' full reply:

I am in the camp that is concerned about super intelligence. First the machines will do a lot of jobs for us and not be super intelligent. That should be positive if we manage it well. A few decades after that though the intelligence is strong enough to be a concern. I agree with Elon Musk and some others on this and don't understand why some people are not concerned.

Google CEO Larry Page has also previously talked on the subject, but didn't seem to express any explicit fear or concern.

"You can't wish these things away from happening," Page said to The Financial Times when asked about whether or not computers would take over more jobs in the future as they become more intelligent. But, he added that this could be a positive aspect for our economy.

At the MIT Aeronautics and Astronautics' Centennial Symposium in October, Musk called artificial intelligence our "biggest existential threat."

Louis Del Monte, a physicist and entrepreneur, believes that machines could eventually surpass humans and become the most dominant species since there's no legislation regarding how much intelligence a machine can have. Stephen Hawking has shared a similar view, writing that machines could eventually "outsmart financial markets" and "out-invent human researchers."

At the same time, Microsoft Research's chief Eric Horvitz just told the BBC that he believes AI systems could achieve consciousness, but it won't pose a threat to humans. He also added that more than a quarter of Microsoft Research's attention and resources are focused on artificial intelligence.
They all seem to agree that any threat is not immediate, and probably far off in the future. As far as I can see, machines today merely mimic intelligence. They certainly have no consciousness.

I found the remark by the Microsoft researcher interesting, that he believes "AI systems could achieve consciousness". I don't see how that could be possible, which is what makes the remark... interesting. It's interesting, too, that Microsoft is focusing such a large percentage of its attention and resources on AI. What would an "artificial consciousness" created by Microsoft be like? Hopefully, nothing like Windows 98. ;-)

Read the original complete articles, for embedded links and more.
     

Monday, January 12, 2015

Skype, with a speech translator?

Supposedly. This was announced last month:



Skype Will Begin Translating Your Speech Today
¿Cómo estás? ("How are you?")

Voice over IP communication is entering a new era, one that will hopefully help break down language barriers. Or so that's the plan. Using innovations from Microsoft Research, the first phase of the Skype Translator preview program is kicking off today with two spoken languages -- Spanish and English. It will also feature over 40 instant messaging languages for Skype customers who have signed up via the Skype Translator sign-up page and are using Windows 8.1.

It also works on preview copies of Windows 10. What it does is translate voice input from someone speaking English or Spanish into text or voice. The technology relies on machine learning, so the more it gets used, the better it will be at translating audio and text.

"This is just the beginning of a journey that will transform the way we communicate with people around the world. Our long-term goal for speech translation is to translate as many languages as possible on as many platforms as possible and deliver the best Skype Translator experience on each individual platform for our more than 300 million connected users," Skype stated in a blog post.

Translations occur in "near real-time," Microsoft says. In addition, there's an on-screen transcript of your call. Given the many nuances of various languages and the pace at which communication changes, this is a pretty remarkable feat that Microsoft's attempting to pull off. There's a ton of upside as well, from the business world to use in classrooms.

If you want to test it out yourself -- and Microsoft hopes you do, as it's looking for feedback at this early stage -- you can register for the program by going here.
Follow the link to the original article for embedded links, and a video.
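
Conceptually, what's being described is a three-stage pipeline: speech recognition, machine translation, then speech synthesis, with the on-screen transcript coming from the intermediate text. Here's a minimal sketch of that flow in Python; the transcribe/translate/synthesize functions are invented stand-ins for illustration, not any real Skype or Microsoft API.

    # Conceptual three-stage translation pipeline. All three stage
    # functions are hypothetical stubs, not a real Skype/Microsoft API.
    from dataclasses import dataclass

    @dataclass
    class Utterance:
        audio: bytes
        lang: str

    def transcribe(u: Utterance) -> str:
        # Stage 1: speech-to-text (stubbed).
        return "hola, como estas?"

    def translate(text: str, src: str, dst: str) -> str:
        # Stage 2: machine translation (stubbed with a tiny phrase table).
        phrases = {"hola, como estas?": "hi, how are you?"}
        return phrases.get(text, text)

    def synthesize(text: str, lang: str) -> bytes:
        # Stage 3: text-to-speech (stubbed as UTF-8 bytes).
        return text.encode("utf-8")

    def translate_call(u: Utterance, target_lang: str) -> tuple[str, bytes]:
        text = transcribe(u)                      # near real-time STT
        out = translate(text, u.lang, target_lang)
        return out, synthesize(out, target_lang)  # transcript + spoken audio

    transcript, audio = translate_call(Utterance(b"...", "es"), "en")
    print(transcript)   # -> hi, how are you?

The machine-learning angle mentioned above fits in stages 1 and 2: the more real conversations the recognizer and translator see, the better their models get.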

See how it works here:

Skype Translator is the most futuristic thing I’ve ever used
We have become blasé about technology.

The modern smartphone, for example, is in so many ways a remarkable feat of engineering: computing power that not so long ago would have cost millions of dollars and filled entire rooms is now available to fit in your hand for a few hundred bucks. But smartphones are so widespread and normal that they no longer have the power to astonish us. Of course they're tremendously powerful pocket computers. So what?

This phenomenon is perhaps even more acute for those of us who work in the field in some capacity. A steady stream of new gadgets and gizmos passes across our desks, we get briefed and pitched all manner of new "cutting edge" pieces of hardware and software, and they all start to seem a little bit the same and a little bit boring.

Even news that really might be the start of something remarkable, such as HP's plans to launch a computer using memristors for both long-term and working memory and silicon photonics interconnects, is viewed with a kind of weary cynicism. Yes, it might usher in a new generation of revolutionary products. But it probably won't.

But this week I've been using the preview version of Microsoft's Skype Translator. And it's breathtaking. It's like science fiction has come to life.

The experience wasn't always easy; this is preview software, and as luck would have it, my initial attempts to use it to talk to a colleague failed due to hitherto undiscovered bugs, so in the end, I had to talk to a Microsoft-supplied consultant living in Barranquilla, Colombia. But when we got the issues ironed out and made the thing work, it was magical. This thing really works. [...]
Follow the link for more, and enlargeable photos that show what it looks like as it's working.
     

Saturday, August 23, 2014

USB Devices and Malware Attacks

New Flaws in USB Devices Let Attackers Install Malware: Black Hat
[...] In a blog post providing more insight into the talk, Nohl and Lell reveal that the root trigger for their USB exploitation technique is the abuse and reprogramming of the USB controller chips, which are used to define the device type. USB is widely used for all manner of computer peripherals as well as in storage devices. The researchers alleged that the USB controller chips in most common flash drives have no protection against reprogramming.

"Once reprogrammed, benign devices can turn malicious in many ways," the researchers stated.

Some examples they provide include having an arbitrary USB device pretend to be a keyboard and then issue commands with the same privileges as the logged-in user. The researchers contend that detecting the malicious USB is hard, and that malware scanners similarly won't detect the issue.

I'm not surprised, and no one else should be, either. After all, this isn't the first time researchers at a Black Hat USA security conference demonstrated how USB can be used to exploit users.

Last year, at the Black Hat USA 2013 event, security researchers demonstrated the MACTANS attack against iOS devices. With MACTANS, an Apple iOS user simply plugs in a USB plug in order to infect Apple devices. Apple has since patched that flaw.

In the MACTANS case, USB was simply used as the transport cable for the malware, but the point is the same. Anything you plug into a device, whether it's a USB charger, keyboard or thumb drive has the potential to do something malicious. A USB thumb drive is widely speculated to be the way that the Stuxnet virus attacked Iran's nuclear centrifuges back in 2010. The U.S. National Security Agency (NSA) allegedly has similar USB exploitation capabilities in its catalog of exploits, leaked by whistleblower Edward Snowden.

While the Security Research Labs researchers claim there are few defenses, the truth is somewhat different.

A reprogrammed USB device can have certain privileges that give it access to do things it should not be able to do, but the bottom line is about trust. On a typical Windows system, USB devices are driven by drivers that are more often than not signed by software vendors. If a warning pops up on a user's screen to install a driver, or that an unsigned driver is present, that should be a cause for concern.

As a matter of best practice, don't plug unknown USB devices into your computing equipment. It's just common sense, much like users should not open attachments that look suspicious or click on unknown links. The BadUSB research at this year's Black Hat USA conference is not as much a wake-up call for USB security as it is a reminder of risks that have been known for years.

     

Thursday, July 31, 2014

The evolution of AI (Artificial Intelligence)

I've posted previously about how slow AI progress will be, and how we won't have anything approaching human intelligence anytime soon. But eventually, as AI evolves, it could start working on itself, and then start advancing very quickly:


How Artificial Superintelligence Will Give Birth To Itself
There's a saying among futurists that a human-equivalent artificial intelligence will be our last invention. After that, AIs will be capable of designing virtually anything on their own — including themselves. Here's how a recursively self-improving AI could transform itself into a superintelligent machine.

When it comes to understanding the potential for artificial intelligence, it's critical to understand that an AI might eventually be able to modify itself, and that these modifications could allow it to increase its intelligence extremely fast.

Passing a Critical Threshold

Once sophisticated enough, an AI will be able to engage in what's called "recursive self-improvement." As an AI becomes smarter and more capable, it will subsequently become better at the task of developing its internal cognitive functions. In turn, these modifications will kickstart a cascading series of improvements, each one making the AI smarter at the task of improving itself. It's an advantage that we biological humans simply don't have.

When it comes to the speed of these improvements, Yudkowsky says it's important to not confuse the current speed of AI research with the speed of a real AI once built. Those are two very different things. What's more, there's no reason to believe that an AI won't show a sudden huge leap in intelligence, resulting in an ensuing "intelligence explosion" (a better term for the Singularity). He draws an analogy to the expansion of the human brain and prefrontal cortex — a key threshold in intelligence that allowed us to make a profound evolutionary leap in real-world effectiveness; "we went from caves to skyscrapers in the blink of an evolutionary eye."

The Path to Self-Modifying AI

Code that's capable of altering its own instructions while it's still executing has been around for a while. Typically, it's done to reduce the instruction path length and improve performance, or to simply reduce repetitively similar code. But for all intents and purposes, there are no self-aware, self-improving AI systems today.

But as Our Final Invention author James Barrat told me, we do have software that can write software.

"Genetic programming is a machine-learning technique that harnesses the power of natural selection to find answers to problems it would take humans a long time, even years, to solve," he told io9. "It's also used to write innovative, high-powered software."

For example, Primary Objects has embarked on a project that uses simple artificial intelligence to write programs. The developers are using genetic algorithms imbued with self-modifying, self-improving code and the minimalist (but Turing-complete) brainfuck programming language. They've chosen this language as a way to challenge the program — it has to teach itself from scratch how to do something as simple as writing "Hello World!" with only eight simple commands. But calling this an AI approach is a bit of a stretch; the genetic algorithms are a brute force way of getting a desirable result. That said, a follow-up approach in which the AI was able to generate programs for accepting user input appears more promising.

Relatedly, Larry Diehl has done similar work using a stack-based language.

Barrat also told me about software that learns — programming techniques that are grouped under the term "machine learning."

The Pentagon is particularly interested in this game. Through DARPA, it's hoping to develop a computer that can teach itself. Ultimately, it wants to create machines that are able to perform a number of complex tasks, like unsupervised learning, vision, planning, and statistical model selection. These computers will even be used to help us make decisions when the data is too complex for us to understand on our own. Such an architecture could represent an important step in bootstrapping — the ability for an AI to teach itself and then re-write and improve upon its initial programming. [...]

The article goes on about ways we might try to control AI self-evolution, reasons why such methods may -or may not- work, and why. Read the whole thing for many embedded links, and more.
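
To make the genetic-programming idea quoted above concrete, here is a minimal sketch of the technique in Python, evolving the string "Hello World!" by selection, crossover, and mutation. This is my own toy illustration, not the Primary Objects code: it searches over characters rather than brainfuck programs, but the mechanism is the same.

    # Minimal genetic algorithm: evolve a random population of strings
    # toward a target using selection, crossover, and mutation.
    import random
    import string

    TARGET = "Hello World!"
    CHARS = string.ascii_letters + string.punctuation + " "

    def fitness(candidate: str) -> int:
        # Number of positions that already match the target.
        return sum(a == b for a, b in zip(candidate, TARGET))

    def mutate(candidate: str, rate: float = 0.05) -> str:
        return "".join(random.choice(CHARS) if random.random() < rate else c
                       for c in candidate)

    def crossover(a: str, b: str) -> str:
        cut = random.randrange(len(TARGET))
        return a[:cut] + b[cut:]

    population = ["".join(random.choice(CHARS) for _ in range(len(TARGET)))
                  for _ in range(200)]

    for generation in range(2000):
        population.sort(key=fitness, reverse=True)
        if population[0] == TARGET:
            print(f"generation {generation}: {population[0]}")
            break
        parents = population[:50]                 # keep the fittest quarter
        population = parents + [mutate(crossover(random.choice(parents),
                                                 random.choice(parents)))
                                for _ in range(150)]
    else:
        print("best so far:", population[0])

Note that nothing in the loop knows what a correct answer looks like beyond the fitness score; all the "selection pressure" comes from that one function, which is what makes the approach so general.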

     

Sunday, March 09, 2014

Thieves who offer Customer Support to victims? It's called "Ransomware"

Just when you thought you'd seen it all:

'Perfect' ransomware is the scariest threat to your PC
Nothing spurs malware development like success, and that’s likely to be the case in the coming months with ransomware.

Ransomware has been around for about a decade, but it wasn't until last fall, with the introduction of CryptoLocker, that the malevolent potential of the bad app category was realized. In the last four months of 2013 alone, the malicious software raked in some $5 million, according to Dell SecureWorks. Previously, it took ransomware purveyors an entire year to haul in that kind of money.

So is it any wonder that the latest iteration of this form of digital extortion has attracted the attention of cyber criminals? A compromised personal computer for a botnet or Distributed Denial of Service attack is worth about a buck to a byte bandit, explained Johannes B. Ullrich, chief research officer at the SANS Institute. “With ransomware, the attacker can easily make $100 and more,” he said.

What distinguishes CryptoLocker from past ransomware efforts is its use of strong encryption. Document and image files on machines infected with the Trojan are scrambled using AES 256-bit encryption, and the only way for a keyboard jockey to regain use of the files is to pay a ransom for a digital key to decrypt the data.

[...]

Honor among thieves
The CryptoLocker crew also know the value of maintaining good customer relations. “They’re honoring people who do pay the ransom,” said Jarvis, of SecureWorks.

“In most cases they’re sending the decryption keys back to the computer once they receive payment successfully,” he explained. “We don’t know what the percentage of people who successfully do that is, but we know it’s part of their business model not to lie to people and not do it.”

Moreover, in November, they began offering support to victims who, for whatever reason, fail to meet the hijackers’ ransom deadlines. By submitting a portion of an encrypted file to the bad actors at a black website and paying the ransom, a victim can receive a key to decrypt their files. “You have to reinfect yourself with the malware but once you do that, you can get a successful decryption,” Jarvis explained.

[...]

Ransomware Inc.
"It is inevitable that we will see a cryptographic ransomware toolkit,” he added, “maybe even multiple toolkits because it’s clear that there’s a business opportunity here for criminals.”

Moreover, that opportunity is likely to reach beyond the consumer realm and into the greener pastures of business. “Going after consumers is small fish,” said Bruen, of the Digital Citizens Alliance. “The next step is to conduct ransom operations on major companies. This has already happened,” he said.

“From an attacker’s perspective, there’s definitely a higher risk in getting caught because companies are going to throw more money at the problem than an ordinary consumer can,” he continued, “but the payoff from one of these companies—a Target or a Neiman Marcus—will be much larger.”

Current ransomware attacks involve encrypting select file types on a hard drive, but a business attack will likely choose a higher value target. “Cryptographic keys and digital certificates are ripe for ransom,” Venafi’s Bocek said.

"Whether it’s taking out the key and certificate that secures all communications for a bank or the SSH keys that connect to cloud services for an online retailer, keys and certificates are a very attractive target,” he observed. [...]
Welcome to the Brave New World. The original article has embedded links, and more details about the evolution of this software, the way it spreads, and its potential future applications.
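
The AES-256 detail quoted above is the crux: with strong symmetric encryption, holding the key is everything. Here's a minimal sketch using the Python cryptography library's AES-GCM mode to show that asymmetry; it just encrypts a byte string in memory, as an illustration of symmetric encryption in general, nothing more.

    # Symmetric encryption demo with AES-256-GCM: decryption is trivial
    # with the key and computationally infeasible without it. That
    # asymmetry is the entire ransomware business model.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # 256-bit AES key
    nonce = os.urandom(12)                      # must be unique per message
    plaintext = b"important-document contents"

    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)

    # With the right key, recovery is instant...
    assert AESGCM(key).decrypt(nonce, ciphertext, None) == plaintext

    # ...with any other key, decryption fails outright. Brute force would
    # mean trying on the order of 2**256 keys.
    try:
        AESGCM(AESGCM.generate_key(bit_length=256)).decrypt(nonce, ciphertext, None)
    except Exception:
        print("wrong key: decryption fails")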

I've already come across a lesser "scareware" version of ransomware, which was mentioned in the article. It locked up one of my Linux computers and wanted payment to unlock it, so this isn't just a Microsoft thing. I was able to get rid of it by uninstalling my browser, clearing the cache, and reinstalling Firefox. But what they are talking about in this article is much more advanced.

Scary stuff.
     

Saturday, February 15, 2014

Will software relationships replace people?

Some, like Larry Ellison, co-founder and CEO of tech company Oracle, see a trend that suggests it's a possibility:

Billionaire Larry Ellison Warns: Be Careful Of 'Relationships With A Piece Of Software'
[...] One man asked Ellison what he thought about the role of tech in our modern lives. Ellison said he was "disturbed" by how much time kids play video games, and what that could lead to. Here's what he said:

My daughter produced a movie called "Her." It's about this guy that gets divorced and is having a rough time finding a relationship until he meets this piece of software ... it's an artificially intelligent bot, that takes no physical form.

Here's a guy that's chosen to have a relationship with a piece of software instead of a human being.

That's one way it can go. You can say that's utterly ridiculous. But I am so disturbed by kids who spend all day playing video games. They've chosen a virtual self.

This weird thing where NFL says 60 minutes a day you should go outside? I know I was a kid a long time ago, but if the sun rose, I was outside on my bike and if my parents were lucky, I would be home before dark.

The fact that people have chosen games where there's a virtual ball rather than a real ball ... that's because [games are] easy. It's very hard for me to be LeBron [James]. I was pretty good at basketball, I'm still not bad, but I'm not LeBron. Now everyone gets to be LeBron in virtual reality. But in reality only one guy gets to be LeBron.

Where does it all end? "Her" is kind of the next thing. What about virtual relationships, where your virtual partner just keeps telling you how great you are?

I won't tell you how the movie ends, but it's amazing: Be careful about virtual relationships with artificially intelligent pieces of software, that are gradually getting smarter than you are.

The truth is, the future that Ellison describes is already here. Virtual girlfriend apps are all the rage in Japan right now.
The mention of the "Virtual girlfriend apps" was a hyperlink, which led to this:

I ‘Dated’ A Virtual Girlfriend For A Week To See What All Those Japanese Guys Are So Excited About
[...] After reading stories about the game Love Plus and how there are Japanese men who would rather date virtual ladies than real ones (one man even got married to his on-screen girlfriend), I wanted to test out what it would be like to date someone who isn’t real. I wanted to test how well a gamified relationship stacked up to real life, whether I could find love — or something like it — amid the pixels and 3D animation.

Love Plus, a Nintendo DS game, is only available in Japan, so I browsed virtual dating apps in the Google Play Store. My Virtual Girlfriend was the most popular.

Here’s how the game works:

[...]
The author, a woman, tests out the virtual girlfriends extensively. Follow the link for details and screenshots. At one point, she concludes:

[...] It's easy to scoff at this game for being stupid, over-the-top, and kinda sexist.

But...

I’ve been in a real relationship for almost a year and, in some ways, playing My Virtual Girlfriend reminded me of what my boyfriend and my early dalliances felt like.

It took time and effort to progress through the levels and if I closed the app and ignored my lady for too long, she needed some sweet talk before warming back up. Starting something new isn't easy. Plus, all the girls responded differently to different things and getting to know them proved surprisingly challenging at times.

Some action-reactions were obvious, but others less so. Tell Jen a joke? She hated it. Ditto with complimenting her eyes, though admiring her smile got her to waggle her hips and giggle at me.

And her thought process was more nuanced than I would expect. After I “gave blood” to raise money to take us on a date, she chastised me for being too broke. So, when I earned the option to flash my cash later in the game, I thought I'd try it since she clearly valued money. But instead of offering her signature giggle, she just looked revolted, quickly rebuking my attempt to win her heart with money.

Unsurprisingly, she also hated my catcalling and, well, picking my nose lowered my love score too.

Unlocking new options and figuring out how to prevent my girlfriend from getting outraged and breaking up with me made me feel like she and I were growing closer, even though she was just following an algorithm. But, despite the fun, gamified challenge of the relationship, I could never see myself developing actual feelings for any girl in the game.

Admittedly, My Virtual Girlfriend can't hold a candle to Love Plus. In that game, you have to work your way through a more complicated romance (there are only three characters with very fleshed out personalities and you start by meeting them in school). The girls can respond to your actual voice and you can kiss the screen to show affection. But, try as I might, I just couldn't find anything with more in-depth capabilities than My Virtual Girlfriend. [...]
The author talks about another program she found that was more sexual, and creepy. She said that a program like the Japanese Love Plus isn't available in the West, probably because of cultural stigma. There may be a stigma, but for how long? Many Japanese things have crept their way into our culture. I wouldn't be surprised if Love Plus makes its way here too.

She ends the article with a brief but interesting interview with the creator of the "My Virtual Girlfriend" program.

Another aspect of The Brave New World is here. Are you ready for it?
   

Saturday, January 25, 2014

The Apple Mac turns 30

A Look Back at 30 Years of the Mac
The Apple Macintosh computer turns 30 on Friday; here's a look back at what made the Mac special and how it evolved over the past three decades.
In the early 1980s, the home computer and business PC revolutions were already in full swing. Apple set the template with the Apple II in 1977, while competitors Radio Shack, Atari, and Commodore followed suit. Meanwhile, in 1981 IBM introduced an artificially crippled, open-architecture, 16-bit machine called the IBM PC, which when combined with Lotus 1-2-3, took off in popularity in business environments large and small.

It was the Macintosh, though, that set the course for both kinds of computing for the next several decades. While Apple didn't invent the graphical user interface, with the Mac the company brought it to mainstream consumers for the first time. Microsoft and IBM immediately began copying its various idioms and design language—at first with a kind of hilarious ineptitude, and then in earnest beginning with Windows 3.0 in 1990 and OS/2 2.0 in 1992. The rest, of course, is history.

Today, the 30th anniversary of the Apple Mac is upon us. It goes without saying we wish Steve Jobs were still around to celebrate with all of us. So with a nod to him—and Steve Wozniak, who started Apple with Steve Jobs, and to everyone who worked on the original Mac and what followed, let's take a look back and see how we got to where we are today. [...]
What a different world it was back then. I was a Commodore 64 user at the time; the Mac was just too expensive. And IMO, it still is. Sure, their stuff IS nice, but I just won't pay that much for overpriced proprietary hardware. If they ever reverse-engineer their software to run on regular PC hardware, I'd consider it, but I won't hold my breath waiting for that to happen.

(I say reverse-engineer because the current Mac operating system is about 80% BSD code. BSD is an open-source, Unix-style operating system designed for the PC platform. Apple had to re-engineer it to work on their proprietary hardware, and they could just as easily reverse that work to make it run on PC hardware if they wanted to.)

But there is no denying the massive impact Apple had on computer operating systems, especially the graphical user interfaces they used. Apple set the standard for the PC GUI. This article is a real Blast from the Past, looking at how it all came together for the first time.
     

Saturday, November 02, 2013

Android 4.4 "KitKat" reduces fragmentation of the platform, emphasizing standalone apps



Android 4.4 KitKat Continues To Reduce Fragmentation Of The Platform Thanks To Google Play
With the release of Android 4.4 (codename KitKat, in an interesting marketing move with Nestle), Google has taken more control of the software stack of the mobile operating system, leaving less control in the hands of the networks and addressing the issue of fragmentation in Android.

This is not a new strategy from Google. Over the last year more and more services are becoming standalone apps in the distribution, rather than an integral part of the firmware. This means that the apps can be updated through Google Play, rather than waiting for a new version of the operating system slowly moving along the chain that stretches from Mountain View, to the handset manufacturers, to the smartphone lines, and then to the carriers, before possibly being made available to the public.

Contrast that with the path Google has for updating an app in Google Play. They simply push the updated app into the Google Play Store, and with countless Android handsets set to automatically update apps from the Play Store, the code will roll out across the ecosystem.

While many people will focus on the share of phones that have updated the firmware to the next ‘notable number’ in the Android versioning, Google has partially sidestepped that process. There’s no longer a dependency on passing the network certification processes to roll out changes to the applications and many of the library features in Android, because they are no longer part of that package. They’re higher up the chain, and their update is completely under the remit of Google. [...]


Android 4.4 KitKat: Google's simpler, integrated operating system designed for every phone


Google Android 4.4 'Kitkat': seven things you need to know
     

Wednesday, October 09, 2013

Windows 8 continues to suck

Windows 7 outpacing Windows 8 adoption
Latest NetMarketshare figures suggest Windows 7 is outpacing Windows 8's adoption, despite a rapid reduction in Windows XP usage over the past quarter.

Over the past month, Windows 8's share has increased by 0.61 percentage points, rising to 8.02 percent of the total share. Windows 7's share, on the other hand, increased by 0.8 percentage points, rising to 46.3 percent of the market.

To put this into context, Apple's latest desktop operating system, OS X 10.8, grew by 0.27 percentage points to a mere 3.7 percent of the overall share. But this figure accounts for just shy of half of Windows 8's overall growth for August.

Meanwhile, Windows XP, which is set to lose Microsoft support for patches and updates in April 2014, lost a hearty chunk of share, dipping 2.25 percentage points to 31.4 percent of the overall market.

It comes at a time when Intel, as the dominant chipmaker in the PC market, may struggle in its second-half earnings, according to Sterne Agee analyst Vijay Rakesh. He warned in a note to analysts on Monday that "back to school PC demand has been virtually absent," which typically drums up mid-year sales of PCs and other devices ahead of the lucrative December holiday sales period. A drop in PC sales for the quarter will no doubt have a negative impact on the software platform market. [...]
Windows 8 is optimized for computers and tablets with touch-screens. Without a touch-screen, you have to rely on scroll-bars, which makes it slow and clunky to use. I believe this is one of the main reasons people are sticking with Windows 7: it works better on computers without touch-screens. There are lots of other reasons too, I'm sure, but I think that is the biggest one at present.

Some very good points are made in this article:

Five reasons why Windows 8 has failed

The article has a chart showing that Windows 8 is doing much worse than Windows Vista did. Gosh, that's REALLY bad. But the five reasons given make perfect sense to anyone who's been paying attention.
     

Thursday, August 22, 2013

Ubuntu Edge: Smartphone Convergence

Ubuntu founder Mark Shuttleworth presents his vision for the future with Ubuntu Edge, a smartphone that will transform into a PC when docked with a monitor, with a full Ubuntu Linux desktop and shared access to all the phone's files:



http://youtu.be/eQLe3iIMN7k

You can read more about what he was trying to do here, but ultimately the crowdfunding effort to launch it failed. Still, the vision is interesting, and there will be Ubuntu smartphones in 2014; just not the Edge. Not yet, anyway.

   

Sunday, May 26, 2013

Can Windows 8 be "fixed"?

We may find out next month:

Windows 8: 5 Hopeful Signs
[...] Windows 8's consumer appeal is about to get a major upgrade.

An important note: this prediction presupposes that the OS's usability issues are addressed in Windows 8.1, a free update, formerly known as Windows Blue, expected to be revealed in June.

There's been some doublespeak from Microsoft on the usability point. Redmond executives have claimed that customer feedback informed Blue's development -- but they've also defended Win8's Live Tile start screen, which has been a significant driver of user criticism. There's a fine line between upholding one's convictions and alienating one's fans. Win 8.1 looks like it will land on the right side of that line -- but I'll come back to that later.

First, here are five reasons things are looking up for Windows 8. [...]
Read the whole thing for the "reasons". I have no doubt that Microsoft will improve it. And that it won't be perfect. The question is, will it be "good enough"?
The article concludes that it will ultimately depend on improving its "usability".

   

Sunday, April 28, 2013

Ubuntu tablet interface enjoys success

Ubuntu 13.04 Review: Linux for the average Joe or Jane
The new Ubuntu Linux distribution, 13.04, aka Raring Ringtail, is ready to go, and for most users, it may be all the desktop they need.
True, many hard-core Linux users have turned against Ubuntu in recent years. Or, to be more precise, they turned against it when Ubuntu's parent company, Canonical, switched from the GNOME 2.x desktop to its Unity desktop interface. They have a point. Unity doesn't give Linux experts the kind of control over the operating system that they get from desktops such as KDE, MATE, and, my own personal favorite, Cinnamon.

However, Unity is not a user-experience failure like Windows 8's Metro. Instead, it's very good at what it sets out to do: Provide a user-interface (UI) that's easy enough for an 80-year old to use and provide an interface that's designed to work equally well for desktops, tablets, and smartphones. In short, Ubuntu is not for Linux power users, it's for all users.

That's very clear in Ubuntu 13.04. While this new version doesn't offer a lot of new features, it has done a nice job of cleaning and speeding up the ones it had. In particular, I noticed how this works on a review system, a 2008-vintage Gateway DX4710. This PC is powered by a 2.5-GHz Intel Core 2 Quad processor, has 6GB of RAM, and an Intel GMA (Graphics Media Accelerator) 3100 for graphics. Unity itself was much faster than before on the same box.

That's because Ubuntu spent a lot of time making performance improvements to Unity. These include: "reduced memory consumption and a great number of small UI fixes to bring a better overall shell experience. Those are like being typo-tolerant in the dashboard when searching for an application, using the mouse scroll wheel on a launcher icon to switch between applications, or better available third-party device handling."

Of course, if you really want Ubuntu, and you really can't stand Unity, there are a wide variety of Ubuntu 13.04 variants with different desktops. These include: Kubuntu, with KDE; Xubuntu, with Xfce; and Lubuntu, with LXDE. [...]
I personally don't like Ubuntu's Unity desktop. It's a tablet-like interface, similar to Windows 8. But Unity isn't as hated as Windows 8. Why? Because it wasn't automatically put on most new PCs and forced on people, like Windows 8 was.

Windows 8 does more than flop

It slides:

Windows 8 blamed as PC sales slide
[...] When the company introduced Windows 8 in October, the operating system was supposed to help a flagging PC market gain — or at least lose less — ground against tablets, with a new touch-based interface. Instead, sales of PCs have dropped faster, and analysts are saying that sales aren’t coming back.

“At this point, unfortunately, it seems clear that the Windows 8 launch not only failed to provide a positive boost to the PC market, but appears to have slowed the market,” said Bob O’Donnell of IDC in a press release. “Microsoft will have to make some very tough decisions moving forward if it wants to help reinvigorate the PC market.” [...]
Some people aren't replacing their PCs at all, but are moving to tablets and smartphones instead. And I suspect that the people who do use PCs don't want the interface dumbed down to work like a tablet. At least, that's true for me.