VMs — Virtual Machines

I got my pickup truck back from the dealer Tuesday afternoon.

It had gone in for the correction of two recalls: control harness shorts, and an air curtain control unit that had been firing off the air curtain system at random (and hence distressing) times.

Oh yeah, and an oil change.

As I went over the work order with the dealer’s Service Writer, she slipped the following in at the end, where maybe the hope was people wouldn’t notice it.

“Oh, yeah, we checked your ECU software, and it was way out of date, so we updated it.”

So.

They flashed my pickup.

Now I’ve run a gazillion updates on my computer.

I’ve updated wireless routers, smart phones and tablets.

And although I’ve had one such update done by replacing a memory card inside an older Engine Control Unit, this was the first time I can remember somebody flashing my pickup truck.

And the truth was, not only was I not concerned about that description, but I greeted it with a sense of enthusiasm and barely contained anticipatory glee.

‘Cause frankly, my pickup truck had been kind of a disappointment.

It had exhibited all sorts of poorly implemented digital control problems, starting with god-awful throttle response to anything short of full throttle (it hated tentative or trailing throttle, and surged on reopening), crappy shift points, and lagging response to any inputs on the driver displays. It went on and on. The whole human interface, what the service techs lump under ‘drivability’, was a thrown-together, laggy, lumpy mess.

The transmission also had a 1-2 shift under light throttle that could only charitably be described as ‘crunchy’. If you had shifted the manual gearbox in your Dad’s car and it sounded like this, he’d have taken the keys away.

The impression you may rightly take from this is that in the matter of my pickup truck, there was a LOT of room for improvement.

Now it shall be stated for the record that I am a nerd.

I know that when you install an update, there are no guarantees it will be better than what was there before.

There are even the statistical corner case updates that will render a device hopelessly inoperable, as in “Ah crap, ya bricked it.”

But in the case of this pickup truck, there were so many areas with room for improvement that it was hard to greet an update with anything other than hope.

So I went to the parking lot, plugged in my key, and flipped the starter.

“Wa-ROOOOM…”

Not Bricked.

Check.

I buckled my belt, and dropped it in gear.

I applied mild throttle — the truck has a small block overhead cam V8 — and I got pressed back in my seat and could hear the rear tires getting profoundly stressed against the pavement.

This was not how the truck I had dropped off behaved.

I ended up taking a scenic route home, on twisty technical roads culled from my motorcycling rides. The route was designed to answer the question of whether this update had done anything significant.

It way had.

The truck now had almost analog throttle response: 2 degrees of extra pedal travel produced a detectable change in engine response, and 2 degrees more than that got more still. No off-then-on herky-jerky stuff. Shifts of the automatic transmission were crisp and in the right place. Downshifts were easy to produce with the throttle when wanted.

Even the box-of-spare-gears soundtrack seemed to be gone.

Now in my mind, anyway, I reach for any understanding I can come by.

It’s possible that my truck had a corrupted ECU software load, but two things seem far more likely.

The first is that the truck had to be shipped before its software was ready.

The more insidious possibility is that this is another form of VW-like mucking with the inspection system.

What if the software the vehicle ships with is engineered specifically to pass the EPA noise and emissions testing, but government regulators have no structure or requirement to inspect subsequent updates to the tested software?

What you have then is one USB-port-loaded Get Out Of Jail Free card.

An engine that has its map set unnaturally lean with retarded spark (reduced output) to pass noise and emissions tests can then have that map replaced with whatever is required to make the engine run its best. Pre-update, my V8 was weak off the bottom, with on-off throttle response. Post-update, it had strong low-end torque with near-ideal linear throttle response.

Night. Day.

The entire behavior of the final complex system was almost entirely a product of software.
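For the fellow nerds, the principle is easy to see in code. Here is a minimal Python sketch of an engine map as a lookup table; every number in it is invented, and real calibrations are far denser tables living in flash, but a reflash is exactly this: swap the numbers, touch nothing mechanical.

```python
# Purely illustrative: an ECU "map" modeled as a lookup table keyed by
# operating point. Every number here is invented; real maps are dense
# 2-D/3-D tables living in flash, not Python dicts.

# Hypothetical as-shipped "compliance" calibration: lean mixture, soft spark.
compliance_map = {
    ("low_rpm",  "light_throttle"): {"afr": 15.5, "spark_advance_deg": 8},
    ("low_rpm",  "heavy_throttle"): {"afr": 13.8, "spark_advance_deg": 18},
    ("high_rpm", "heavy_throttle"): {"afr": 13.2, "spark_advance_deg": 24},
}

# Hypothetical post-reflash calibration: richer mixture, more advance.
performance_map = {
    ("low_rpm",  "light_throttle"): {"afr": 14.2, "spark_advance_deg": 14},
    ("low_rpm",  "heavy_throttle"): {"afr": 12.8, "spark_advance_deg": 26},
    ("high_rpm", "heavy_throttle"): {"afr": 12.5, "spark_advance_deg": 32},
}

def targets(ecu_map, rpm_band, throttle):
    """Look up the fueling and ignition targets for one operating point."""
    return ecu_map[(rpm_band, throttle)]

# A "reflash" changes nothing mechanical; the same engine just gets
# different numbers at every operating point.
print("before:", targets(compliance_map,  "low_rpm", "light_throttle"))
print("after: ", targets(performance_map, "low_rpm", "light_throttle"))
```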

 

***

 

If you live in the United States, and you watch — even occasionally — the infernal televising machine, you’ve been saturated with this —

“There’s no one road out there. No one surface. No one speed. No one way of driving on each and every road. But there is one car that can conquer them all…”

And the nice folks at Mercedes Benz would like it very much if you would buy “that one car”.

Except that it’s a lie.

Because that one car is five cars.

Or maybe twenty five cars.

All of which are a result of choices made in software.

The commercial shows the console-mounted driving mode selector: Eco, Comfort, Sport, Sport+, Individual.

The Benz’ driving mode selector moves between defined ‘personalities’ that make changes to the suspension’s spring rates and valving, the engine’s throttle response, aggressiveness of the transmission’s shift points, and the degree of assist and ratios for the power steering.

The range of adjustment allows performance that varies between the boulevard ride of a Lincoln Town Car and the reflexes of a Nissan GT-R, with a few intermediate stops along the way. If the predefined options are not sufficient, you can brew up a custom gumbo of your own preferences (soft ride, slow steering and full power for your next entry in the Gumball Rally) and save it as “Individual”.
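Under the hood, a drive mode is little more than a named bundle of parameters. A toy Python sketch of the idea, with parameter names and values invented for illustration rather than taken from Mercedes:

```python
# Toy model of a drive-mode selector: each mode is just a named bundle of
# chassis and drivetrain parameters. Names and values are invented for
# illustration; the real parameters and ranges are the OEM's.
MODES = {
    "Eco":     {"throttle_gain": 0.6, "upshift_rpm": 2000, "damping": "soft",  "steering_assist": "high"},
    "Comfort": {"throttle_gain": 0.8, "upshift_rpm": 2600, "damping": "soft",  "steering_assist": "high"},
    "Sport":   {"throttle_gain": 1.0, "upshift_rpm": 4200, "damping": "firm",  "steering_assist": "medium"},
    "Sport+":  {"throttle_gain": 1.2, "upshift_rpm": 5500, "damping": "track", "steering_assist": "low"},
}

def individual(**overrides):
    """Brew a custom 'Individual' mode by overriding a base personality."""
    profile = dict(MODES["Comfort"])   # start from a sane baseline
    profile.update(overrides)
    return profile

# Soft ride, slow steering and full power: the Gumball Rally special.
gumball = individual(throttle_gain=1.2, upshift_rpm=5500)
print(gumball)
```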

But the entire behavior of the final complex system is almost entirely a product of software.

 

***

 

When this kind of behavior crosses into the motorcycle realm, my interest goes from the theoretical to the practical.

And it seems that, at least with the motorcycle manufacturers that are working in partnership with Bosch, that barrier has not just been crossed, but obliterated.

Motorcycles like the KTM Superduke GT and SuperAdventure, and BMW’s new S1000XR now have all of the functionality of the aforementioned C Class Mercedes, and, frankly, more.

Much more.

The KTMs have active suspension that adjusts springing and damping within user-selected ranges. Brake hard over a bumpy surface and the suspension will adjust itself for anti-dive and to track the irregularities as the sensors perceive them. Lift the front wheel and compression damping will drop to ensure a controlled landing from the wheelie.

All of that pales in comparison to the safety systems that are unique to two wheeled machinery.

Traction control and anti-lock brakes now consult inertial measurement units (IMUs) to make different forms of correction to motorcycles that are accelerating, decelerating or leaning. The aforementioned wheelie (one of the core joys of proficient motorcycling) can be completely dialed out by the IMU and traction control systems. ABS that would still make enough braking force to wash out the front wheel while cornering is a thing of the past.
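To make that concrete, here is a toy Python sketch of the kind of arbitration an IMU enables. It is emphatically not Bosch’s algorithm, which is proprietary; it just illustrates the ‘traction circle’ idea, in which permitted braking force shrinks as lean angle grows, plus a crude wheelie-rate cap on throttle:

```python
import math

# Toy model, not Bosch's algorithm: available grip is shared between
# cornering and braking (the "traction circle"), so allowable brake
# force shrinks as lean angle grows.
MU = 1.0  # assumed peak tire friction coefficient

def max_brake_g(lean_deg: float) -> float:
    """Cap longitudinal (braking) g so combined demand stays inside grip."""
    lateral_g = math.tan(math.radians(lean_deg))        # g needed to hold the corner
    remaining = max(MU ** 2 - min(lateral_g, MU) ** 2, 0.0)
    return math.sqrt(remaining)

def throttle_cap(pitch_rate_deg_s: float, requested: float) -> float:
    """Trim the rider's torque request if the front end is lifting too fast."""
    if pitch_rate_deg_s > 10.0:    # arbitrary wheelie-rate threshold
        return requested * 0.5
    return requested

for lean in (0, 20, 40, 50):
    print(f"lean {lean:2d} deg -> max braking {max_brake_g(lean):.2f} g")
```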

One moto-magazine article was filled with nearly uncontrolled and uncontrollable mirth after a team of testers set out, on the Bosch test range, to prove that they were able to crash the system-equipped bikes.

The best they were able to do was to come to a complete stop while leaned over in a corner (otherwise impossible, I might note) and then fall over, which had them all giggling like schoolgirls.

Motorcycle electronics have taken the next quantum leap.

Formerly, if you wanted a race bike, you needed stiff suspension, lumpy camshafts and large tuned intakes and exhausts. Tourers needed small intakes and soft suspension. People riding off pavement needed suspension and power delivery designed to keep the machine controllable while the drive wheel was spinning.

All of these things required divergent designs – designs that resulted in vastly different structures in metal.

Now any and all of those things are available at the touch of a button.

In a recent Cycle World review of the SuperAdventure, Blake Connor described the adjustability of the machine this way – “..the Super Adventure can be anything you want it to be. Mile-eating tourer? No problem. Roost-chucking adventurer? Yup. Wheelie-crazed hooligan? Yeah, that too. ”

Motorcycle enthusiasts like me have had garages full of different motorcycles for our different moods.

Now the entire behavior of a motorcycle is almost entirely a product of software.

I’ve gone through a difficult period of adjustment, in which I had philosophical objections to human beings losing the skill to operate motorcycles unassisted by technology.

But the game has moved on.

We’re no longer talking about simple rider assistance.

The entire character and physics of the machine have moved from metal to software code.

What once was the most demonstrably physical of man’s inventions is now a virtual machine.

Autonomy

If there’s one thing I love to do, it’s drive.

If there are two things I love to do, it’s ride motorcycles.

I bring a skilled, engaged, activist perspective to both related activities. I like my cars and bikes analog, with manual gearshifts, torquey engines, firm suspensions, powerful brakes.

These machines are complex systems whose sole mission is to translate my will and skill as a driver and rider into a precise path down the road of my choosing. They are machines that amplify and multiply the speed and force of the human that controls them into a ballet of physics: acceleration, friction, g-forces.

I’ve just been on a ride through Northern Virginia’s Silicon Valley of the East, and I can tell you that I am all but alone in seeing any value in this.

***

I consider myself an expert on a small range of topics. Information Technology and Internal Combustion transportation just happen to be two of them.

Where the two overlap is something that fascinates me.

In recent years, the trend has been to drive increasing amounts of automation into the management of moving vehicles.

Anti-lock braking systems were the first automation to find their way onto cars and motorcycles. The theory and practice are both simple: use the increased processing power of sensors and computers to modulate braking far faster and more consistently than any human can.

The theory is excellent.

Initially, though, the practice was awful.

The first ABS computers were simply not adequate for the task at hand. Their ability to recognize a locked wheel was simply not fast enough, and the number of cycles per second at which these systems could operate was far too low. At 60 miles per hour a vehicle covers 88 feet every second, so if a system took a quarter second to release and reapply the brakes, each cycle meant 22 feet with no effective braking; over the three or four cycles of a panic stop, that could add up to 80 or 90 feet of freewheeling.
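The arithmetic is easy to check for yourself; a quick sketch:

```python
def unbraked_feet(speed_mph: float, cycle_s: float, cycles: int = 1) -> float:
    """Distance covered with the brakes effectively released during ABS cycling."""
    feet_per_second = speed_mph * 5280 / 3600    # 60 mph -> 88 ft/s
    return feet_per_second * cycle_s * cycles

print(unbraked_feet(60, 0.25))              # 22.0 ft per quarter-second cycle
print(unbraked_feet(60, 0.25, cycles=4))    # 88.0 ft over a four-cycle panic stop
```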

On a motorcycle, the cycle times of those early systems were long enough to make it likely you would hit the very thing you were panic braking to avoid.

The marginal performance of those early systems was short lived, though.

Bosch’s most recent motorcycle ABS system is so proficient that – in a documented head-to-head test – it was able to consistently outbrake the Racer who won last year’s Baja 1000 offroad rally on a loose dirt surface – the worst case scenario for measuring the effectiveness of release and repressurize ABS cycle times.

***

ABS, though, was just the first salvo in what was to become an all-out e-war for control of the motor vehicle.

Traction control (essentially ABS in reverse) was next: control tires spinning under power, using the same sensors required for ABS, and one ‘assists’ the driver in maintaining directional control.

This notion of driver ‘assist’ started out as a passive thing – modulate some driver control inputs where necessary to maintain vehicle control.

But the notion of passive driver or rider aids very quickly changed, and in a very fundamental way. As the available processing horsepower and bandwidth of microprocessors exploded, the ‘assists’ crossed a line of demarcation where they first became active (doing things a driver didn’t or couldn’t do) and then became autonomous, in that they did things on their own while completely disregarding any input from the ‘driver’.

The first time I threw one of my family cars, a 2009 Nissan Cube, into an exit ramp in a way that made the vehicle’s dynamic stability control ‘uncomfortable’, and the car’s systems differential-braked just the two outside wheels, I knew a line had been crossed.

Look at the automobile commercials you see every day on television. SUVs that stop themselves when something crosses behind the vehicle while reversing. Automobiles that stop themselves when forward-facing radars detect an imminent collision. The Mercedes-Benz commercials that depict their top-of-the-line sedan threading its way through a gauntlet of delivery trucks hurtling simultaneously from both sides of the roadway.

The endpoint of that process is easy to divine – at a certain point, the available ‘assists’ become completely autonomous, and the ‘driver’ becomes completely superfluous.

***

I’ll make no bones about disclosing that, at least on motorcycles, I was profoundly un-enthusiastic about many of these developments.

Motorcycles adopted ‘ride-by-wire’ technologies from aviation practice where the throttle grip was no longer connected to butterflies in the intake tracts of the engine – instead it was connected to the Engine Control Unit (ECU), whose software was in charge of translating your requests to what was happening at the contact patch of the rear tire.

The fine line between a motorcycle rider and a master motorcyclist had been, pre-rider-aids, the ability to control the vehicle once one or both of the wheels started to slide. From Gary Nixon through Kenny Roberts to Valentino Rossi, the great champions of the sport were all marked by their uncanny ability to modulate and control a dynamic system in which drift at the tire contact patches was what made them faster.

Personally, that mastery had taken me years to develop, and was much of what appealed to me in the sport. I told everyone who would listen that I was not prepared to simply cede that control to some outsourced software engineer in South Asia. I’ve worked with software my entire career. Errors in software fueled much of my professional life. Errors in software in this environment were something I had no desire to debug.

Following that, widespread ridicule rained on my head. I was called a Luddite. People mocked me for hypocrisy – working in technology but refusing the inevitable progress it provided. I was told that the advance of technology in motorcycling was designed to ‘free me to concentrate on more strategic decision making’. My response to this was that these technologies would essentially ensure that new riders would never have the opportunity to develop the skills to truly understand how their vehicles worked.

The technology has continued to advance. Today’s Yamaha YZF-R1 street motorcycle has an inertial measurement unit (IMU), featuring gyroscopes, accelerometers and GPS, that can tell how far the motorcycle is leaned over, whether it is speeding up or slowing down, and whether the rate of acceleration or deceleration is lifting the front or rear wheel. The IMU outputs can control engine output and braking, and make fine adjustments in both the valving and springing of the suspension in real time. This unit, which is sold as part of a vehicle that retails for $16,000, is several orders of magnitude more capable than IMUs still deployed as part of America’s nuclear weapons arsenal.

Four of the top five riders competing in MotoGP racing today are too young to have ever ridden a motorcycle that did not have such electronic aids. They are successful because they never developed the reflexes of self-preservation that tell one it is not survivable to open the throttle wide when the bike is leaned over to its limits. These guys just take the bike to its limits, roll the throttle wide open, and let the sensors and management units do their jobs.

That war is over, and unsurprisingly, I am vanquished.

***

So, it is more than a little surprising when I tell you today that I think that a car that drives itself is not a bad thing. ™

Automation, regardless of the area of endeavor, has always been a natural technological development for tasks that humans no longer wanted to perform.

On a commuting run today on my motorcycle, the number of people that I saw in traffic that were doing anything and everything other than driving was nothing short of staggering.

In Maryland, where I live, use of any handheld electronic device while driving is now a primary moving violation offense, for which any police officer can pull a driver over with no other violation in evidence. Compliance and enforcement are not yet at the point where I, as an unprotected road user, would like to see them, but the new law has reduced the most egregious abuses.

Unfortunately, most of my route is in Virginia, which has no such regulations.

I’ve seen people in Northern Virginia traffic with holding fixtures for their iPads mounted on their steering wheels. One such individual was, when I pulled alongside, binge-watching episodes of Breaking Bad.

I’m fortunate that a much older and very skilled and experienced motorcyclist taught me, when I was young, to actively observe the visual focus of the driver of every vehicle I approach by looking in their rearview mirrors. In 1985 it wasn’t a problem, but today a cocked head means a phone in use, and heads and eyes directed down inevitably mean smartphones or tablets being used to surf or send messages.

Dealing with the motor vehicles of the distracted is far worse than dealing with drunks. Their lines on the road are irregular: they weave and fail to maintain lane or following-distance discipline. They cause traffic flows that speed up and slow down unpredictably due to their utter lack of situational awareness. They hit other vehicles, in rear-end impacts or in intersections, at high closing rates without ever having braked before impact. They change lanes abruptly without warning. Use of turn signals is almost unheard of.

For these people, a combination of distraction and congestion has made driving simply a dull, uninteresting chore that is to be avoided at all costs.

And for those people, the autonomous cars currently being developed by Google, Tesla and Apple are a compelling technology that will keep them, and by inference, the rest of us who are forced to share pavement with them, safer.

A surprised or inattentive human, armed with a 2-ton SUV or crossover, is the most unpredictable, and hence dangerous, thing imaginable.

I have to think, understanding software and automation the way I do, that automated cars will be, if nothing else, entirely predictable in how they approach standard conditions on public highways. Their sensor arrays will make them more aware of their surroundings than the average human.

After considering the possibility for quite some time, I’ve come to the conclusion that having the robot cars as my travel companions has to compare favorably with Breaking Bad guy.

It just can’t be worse.

Just don’t be surprised if I keep a 50 year old motorcycle hidden under a tarp somewhere where the robots can’t find it.

Pardon Me, But I Believe You May Have The Wrong Disruption

Technology and economics are inexorably linked.

Understanding one, it seems, can greatly clarify one’s understanding of the other.

When a new technology works – as a business proposition – it does so because it disrupts an economy. It either dramatically changes existing flows of money within the economy or, in the optimum case, creates entirely new flows as a result of its existence.

And there have been some technologies – flashes of genius – that transform the entire economies they inhabit. Ones that have changed the way in which we do business – changed the way we live.

So trying to understand technology – which I try to do – means sometimes starting by trying to understand the underlying economics.

Which is a really roundabout way of talking about Elon Musk and Tesla Motors.

A car company, you ask?

The American Landscape is littered with ‘Visionaries’ – we almost spit when we say it – who wanted to change the automobile business.

The car business is a cute little business – we Americans buy about 20 billion dollars’ worth of automobiles every year. A company that succeeded in taking 10% of that market would be a $2-billion-a-year company.

One might be successful in the car business, but it wouldn’t make you Apple, for example, a company with a nearly 750 Billion Dollar market capitalization.

And the mention of Apple matters, because Apple didn’t create a simple product, but an entire ecosystem that could grow and expand in unpredictable ways, to the point where Apple makes consumer electronics, sure, but also earns incredible revenues from entertainment, publishing and media, and now, with ApplePay, financial services.

People who thought Apple was a company that just wanted to make computers were wrong.

What if those people – what if you – are wrong about Tesla, too?

Public information about Tesla’s current projects and issued patents really tells the tale, if you can see what it’s telling us.

The Tesla Nevada Battery ‘Gigafactory’ has the capacity to produce a quantity of lithium-ion batteries that far exceeds the needs of Tesla Automobile’s most optimistic sales and production projections.

Why do that?

If Tesla isn’t a car company, it starts to make sense.

Tesla has already been issued patents for a series of hybrid Solar Power and Battery Storage power units, some designed for residential use, others for industrial scale. These units make use of solar panel technology from Solar City – a company whose Chairman is Elon Musk. These battery units are designed not just to charge a Tesla car, but to provide enough power to run the entire residence or business to which they are attached. A technology which not only frees its users from the gas pump, but also potentially from the public electric grid, as well.

Americans spend more than $350 billion every year on gasoline. 10% of that market would be a healthier $35 billion in annual sales.

We also spend a similar amount on electric power in the US. Think another $35 billion.

Start showing some upside on that $70-odd billion of US sales, factor in the rest of the world, and it takes very little to make Tesla one of the primary movers of wealth in the whole world economy.

Nikola Tesla was a genuine visionary who thought the power of electricity moved the entire universe.

Allow me to suggest that Tesla Motors was never about the automobile. The car’s purpose was to create new demand, an entry point for a whole ecosystem of a different kind of energy: energy that is a personal commodity, made and used by you as you see fit.

Tesla, the company, doesn’t want to beat Ford; they’re after the Royal Dutch Shells and BPs, the Constellation Energies of the world. Not a car company, but a disruptor of the world’s energy markets.

A Californian company designing some tree-hugger’s dream impractical green electric car?

Hardly.

The vision is one completely in line with that of Nikola Tesla himself: limitless electric power that is not part of some pay-by-the-drink model where oil companies and electric utilities profit endlessly from a global addiction to the energy they provide at monopoly prices.

The Tesla automobile – as impressive a technical achievement as it is, and as good a car as it is – is really just small potatoes.

Don’t say nobody told you.

To Serve Man

 

There is so much technology woven into our modern lives that some things appear almost indistinguishable from magic.

Wave your hand at your TV or game system, and it recognizes you, greets you by name, and brings up the content and applications that are part of your personal profile.

After a recent software upgrade for my phone, I turned it on the next morning.

“Good morning, Greg!” it said. “Is this where you live?”

I’ll admit that, like it or not, I have become an information systems-augmented mind. In the middle of conversations, or when engaged by other media – like movies, news or sports – I’ll turn to Google, or to the aforementioned creepily inquisitive phone, and either find information to give me context or fact check what I’m hearing.

With all this magic swirling in our air, I find myself wondering if anyone has devoted any thought, during the years upon years of strategizing, designing and building it, to whether all this technology is beneficial for us, the human beings.

I think about this a lot, and more and more frequently, I find myself concluding that we are collectively just not that concerned about it.

We have tech for tech’s sake. We do these things – create these technical miracles – because we can, not because we have examined their effects on us.

It recalls the old ’Twilight Zone’ episode about a race of Aliens that come to Earth, with what we assume are friendly, altruistic motives. And we continue thinking this until we discover their ‘bible’ – “To Serve Man” – is not a spiritual guide, but rather, a cookbook.

I see the parallels: our technology is assumed to be benign, but may be hiding deeper, darker outcomes.

There are lots of subtle examples. I could write a book on the adverse cognitive effects of ubiquitous text, e-mail and social messages interrupting activities that formerly required absolute intellectual focus. Reduced analytical horsepower and declining worker productivity are subtle, borderline subjective impacts, but one recent area of concentrated technological growth provides a perfect example of capability ignoring the potential insights of psychology and cognitive science.

A recent Washington Post story reported on the automotive industry’s drive to fit cars with Heads-up Displays (HUDs) to complement the in-car telematics and navigation systems.

The manufacturers claim that these systems, which project information and images into the driver’s field of view, will increase vehicular safety by reducing distractions. Cognitive science and experience seem to point to exactly the opposite. I will give massive credit to the authors of this Post piece, Drew Harwell and Hayley Tsukayama, for refusing to accept that message at face value, and for highlighting the significant evidence pointing to the contrary.

“Our military fighter pilots use it – it must be safe…” Actually, the military has documented something they call ‘attention capture’, a cousin of the phenomenon known as ‘target fixation’, in which a vehicle operator stops seeing anything outside the most colorful visual input, and the vehicle then goes in the direction the operator is looking. As a result of these findings (information projected into the pilot’s visual field reduces pilot focus and performance), the military is rapidly de-emphasizing use of these systems.

Simple instrument displays are just the beginning – one vendor wants to project a ‘virtual car’ into the visual field so you can follow it, rather than the customary GPS arrows. Another, the Skully motorcycle helmet, wants to project this kind of information on the inside of a motorcycle helmet visor.

I consider myself an expert in the area of performance motorcycling, and anything which introduces even a millisecond of cognitive delay (“…is that ‘object’ real or projected?”) will adversely affect rider (or driver) performance.

Too many inbound information sources, like texts and tweets, want to distract you from getting tasks accomplished. In-vehicle HUDs and telematics systems want to distract you and get your vehicle crashed, and you, by inference, killed.

Technology is a wondrous thing, but only when its products are guided by the wisdom of designers who understand what human beings really need. That quality, which I’ll call informed design, has been all too absent in the last dozen or so years. We continue to do things, to push the boundaries of technology, because we can, not because we’ve done the deep thinking to see if it’s really a good idea.

So the next time you pick up some new, shiny electronic thing, detach yourself for a minute and consider.

“Is this thing here to help me, or to eat me?”

Nothing

The field of IT security rests on the premise that information on a network can be secured.

What if that’s wrong, though?

What if, in point of fact, nothing can be secured?

What if saving information on any device means that anyone with a strong enough interest in knowing it probably does?

What if nothing is secure?

Well, then, that changes things.

Imagine, for a second, what it means if every piece of digital information in and describing your existence – banking records, digital photos, government records, social networks, e-commerce and location data – is easily available to anyone motivated enough to obtain it.

In the current state of information technology, that statement likely describes the actual state of information security.

It’s a broad-brush, game-changing statement. What kind of evidence could one possibly point to in support of such a claim?

All one needs to do is stay current with the news. One can start with all of the revelations that were provided by Edward Snowden — http://www.wired.com/2014/08/edward-snowden/ — including that all of the world’s major cellular, wireline, internet service and web organizations were systematically compromised by US Intelligence and that all of their content was siphoned off for later analysis.

Consider that for a second: a government agency was able to build a security-breach apparatus so large that it can suck up the entire Internet every day and save it all for later, just in case there turns out to be anything interesting in there. The extent of the lack of security was driven home by the fact that the cryptographers the US government had graciously ‘lent’ to the organizations that constructed all of the current encryption tools were engaged for the sole purpose of implementing tools that were compromised before they were built.

The Snowden revelations are only the tip of the iceberg, though. Consider ‘Regin’ – another massive espionage toolset whose sole purpose is to systematically compromise whole networks using multiple attack and compromise modalities.   http://www.wired.com/2014/11/mysteries-of-the-malware-regin/

The large scale compromise of the Sony Pictures network in the past week raised the game and the stakes to a new level. http://www.nytimes.com/2014/12/04/business/sony-pictures-and-fbi-investigating-attack-by-hackers.html?_r=0  In what was clearly a long-term, sophisticated attack, outsiders were able to compromise systems to such a significant extent that they were able to steal substantial amounts of digital property – completed but unreleased HD feature films, scripts  — and to place agents on the network that had the potential to destroy other digital property stored on the network. IT Security experts have long feared the day when being ‘owned’ by the hackers would turn from a public relations embarrassment with secondary economic consequences to the next stage – where the primary intent was to do damage and cause direct economic harm to the target.

Today is that day.

The threat landscape to your information is so diverse that even things that seem utterly benign (charging a USB device off an available port on your computer, for example) are now critical attack vectors: https://www.yahoo.com/tech/e-cigarette-from-china-infected-mans-computer-with-103466334849.html

If this sounds to you like a narrative of a technology in crisis, that’s because that’s exactly what it is.

So if you’re an IT guy with a network and a business to protect, what steps can be taken to at least give you a fighting chance?

Folks that I know in the intelligence business have told me that Russian Intelligence – the heirs of the old Soviet KGB – assessed this threat and went back to entirely analog communications methods. Pencil and paper. Postal Mail. Carrier pigeons. One can’t hack what isn’t stored or transmitted.

That may seem an extreme solution, but in certain cases, the principle might have applications.

In the short term, user education and traditional defense-in-depth are the transitional solutions. Well-educated users, who understand and can recognize social engineering, spear phishing and other first-level compromise vectors, are the best way to keep things under lock and key. An in-house security team, or a skilled security services provider, with an array of tools to detect and contain breaches in their early stages before they spiral out of control, is the second weapon.

In the longer term, though, some very fundamental technologies need to be completely rethought and reengineered before any acceptable level of security can be retaken from ill-intentioned Nation-State actors and cyber criminals. The IT industry as a whole needs a new architecture and new component technologies that are designed from the ground up to be secured, which the current toolset absolutely was not. Think of it as a roadmap for secure computing, and a roadmap which requires an overhaul of much of the networking and computing landscape.

Access and identity management (who are you, and what can you see) needs to be completely rethought. Whether the solution is biometrics, two-factor authentication or something we haven’t thought of yet, the notion that a text username and a password can get one access to any system or network resource is a concept whose fundamental failure is well understood.
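One concrete direction already exists: time-based one-time passwords (RFC 6238), which replace the static password with a short-lived code derived from a shared secret and the clock. A minimal sketch using only the Python standard library; the secret shown is a placeholder demo value:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password from a base32 shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period            # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Demo secret only; a real enrollment would generate one per user.
print(totp("JBSWY3DPEHPK3PXP"))
```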

Another fundamental technology – encryption – needs to be completely reengineered without any involvement from experts currently employed by the military or intelligence industries. Encryption tech needs to be secure in ways that the current protocols are not.

The Domain Name System (DNS) is the roadmap of the internet. DNS was built from the ground up to be open and to support realtime updates. Many sustained hack attacks rely not on compromising a target system, but on hacking DNS to redirect users to another, compromised system without users being aware of it. Information that one would comfortably store on one’s own server becomes very uncomfortable if the server turns out to be someone else’s.
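One crude smoke test is to ask two independent resolvers the same question and compare notes. The sketch below assumes the third-party dnspython package is installed, and a mismatch can be perfectly legitimate (CDNs and geo-DNS return different answers by design), so divergence is a reason to look closer, not proof of compromise:

```python
# Naive tamper check: resolve the same name through two independent
# resolvers and compare answers. Requires the third-party dnspython
# package; treat divergence as a smoke-test failure, not proof of attack.
import dns.resolver

def answers(name: str, nameserver: str) -> set:
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [nameserver]
    return {rr.to_text() for rr in resolver.resolve(name, "A")}

name = "example.com"
a = answers(name, "8.8.8.8")   # Google public DNS
b = answers(name, "1.1.1.1")   # Cloudflare public DNS
print("consistent" if a & b else f"divergent: {a} vs {b}")
```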

Operating systems – notably Windows, but Unix- and Linux-based OSs as well – need to be completely re-architected, starting with kernels, privilege models and buffer structures that use security concerns as their foundational requirement. The very notion of a buffer overflow (enter enough crap and you’ll crash the command processor and be able to do whatever you want) breaking the overall system is a notion we’d like our children and grandchildren to be able to laugh at someday. To make that happen we need to ditch everything and start coding today.

Finally, security tools need to take a quantum leap forward in capability and accuracy. Current tools generate so many false positives that a dedicated team is needed to triage the alerts and determine which ones are genuine threats. A recent lab test of an intrusion detection tool (one I cannot name, as my current employer is a partner) detected 93% of the attacks that were set loose on the control hardware. In an environment where one hacker success is all it takes to lose your company, 93%, while an extraordinary achievement on one level, is completely, pitifully ineffective when viewed in terms of what these tools must really be able to do. Security tools need to make quantum leaps in analyzing traffic and behaviors to reliably identify access, information theft and management privilege use by unauthorized individuals.
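The arithmetic behind that judgment is brutal. Assuming independent attack attempts, the odds that at least one slips past a 93% detector climb quickly:

```python
# If a tool catches 93% of attacks, and attempts are independent, the
# chance that at least one of n attacks gets through grows fast.
detect = 0.93
for n in (1, 10, 50, 100):
    p_breach = 1 - detect ** n
    print(f"{n:3d} attacks -> {p_breach:.0%} chance at least one gets through")
```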

There was a time when I believed that all information stored on IT systems was secure unless something extraordinary occurred.

Today, I believe that nothing stored on any Information System is secure in any way.

The future belongs to those companies that understand that this is an existential threat to our civilization, and are willing to abandon everything that has gone before, and create the second revolution in Information Technology.

This time around, we need to get it right.

Until then, take good care of those carrier pigeons.

Change of Focus

I’ve spent my entire working life supporting Information Technology.

And if you slow way down, back up and re-read that sentence, and then genuinely and deeply think about it, it only takes a second or two to realize just how utterly wrong and totally backwards that idea really is.

And that fundamental inversion of purpose and perspective is at the root of what is the next revolution in how we look at Information Technology and its support for businesses.

The reality is that I haven’t been supporting technology – I’ve been supporting people that make use of technology.

But the fixation on the technology itself – rather than the technology’s underlying purpose – is such an ingrained notion in the Technology and Technology Support businesses that it goes unquestioned, with virtually no one realizing that they’ve been focusing on the wrong things.

Look at any Company’s Request for Proposal document, and one sees lists of equipment – PCs, Laptops, Routers, Switches, Servers, and Security Appliances. What seems to get lost in these sales, service and outsourcing engagements are the poor humans and the work that they need to accomplish.

And that’s why this entire industry needs to completely change their focus. We’ve been focusing on the Personal Computer as the atomic unit of support.  We may occasionally shift focus to some of the back-end Infrastructure required to support those PCs, but it’s really been all about the PCs.

Organizations that want to lead in the Technology Services business need to embrace something that focuses on the human beings and the work they need to get done. The work that people need to accomplish, after all, is what pays everyone’s bills, and locating oneself in relation to the money has never been a bad business practice. Buzzwords have never been my thing, but let’s think of it as User Centered Computing.

What are the characteristics of User Centered Computing?

Changes in the technology landscape are what enable this radical change of focus. Computing power, the solving of problems and the completion of work have moved away from the Intel processors and Windows operating systems that drove the personal computing revolution. Networks have gone from drinking-straw-width, piddly little pipes to ubiquitous massive bandwidth; heck, 3G wireless, which is trailing-edge technology, has more speed than some of the enterprise and public sector site networks I implemented early in my career.

The PC, which was formerly the only place where work got done, has become just one of a number of ways that one can access the power of cloud-based, distributed computing, and viewed as a percentage of total work, it is a declining portion of that access.

The work, in the simplest terms, has migrated to the network and to the cloud, and the PC has become just another screen, with tablets, smart phones and other emerging and hybrid devices all able to provide access to infrastructure based analytics and remote processing power. User Centered applications run in datacenters which have nearly infinite amounts of bandwidth with which to communicate to the user, who is now free to be almost anywhere.

So what does this mean for organizations that want to provide Information Technology Services to customers?

It is the infrastructure itself that has now become the central service. E-mail, unified messaging, voice/presence/video, collaborative and meeting applications, and web collaboration tools like Microsoft Sharepoint all provide essential services to the workforce, services that are neither tied to nor processed on a single class of computing or communications device. The workspace has become an ecosystem of highly integrated and machine-independent services.

How those services are delivered, defined in terms of the work that individuals have to do, becomes a series of bundled services based on the type and value of the tasks those users can be expected to perform. Classes of service are then defined in terms of personas that carry with them a set of collaboration, communication and processing applications and access methods appropriate to those business roles.
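A persona, in other words, is just a named bundle of services with a price attached. A simplified Python sketch follows; all of the persona names, services and prices are invented for illustration:

```python
# Invented example personas; a real catalog of services and prices would
# be the provider's own.
PERSONAS = {
    "task_worker": {
        "services": ["email", "voice", "web_collaboration"],
        "devices":  ["thin_client", "smartphone"],
        "monthly_usd": 35,
    },
    "knowledge_worker": {
        "services": ["email", "unified_messaging", "video", "sharepoint"],
        "devices":  ["laptop", "tablet", "smartphone"],
        "monthly_usd": 95,
    },
    "road_warrior": {
        "services": ["email", "unified_messaging", "video", "vpn", "mobile_crm"],
        "devices":  ["laptop", "smartphone"],
        "monthly_usd": 140,
    },
}

def monthly_bill(headcount: dict) -> int:
    """Pure operating expense: pay only for the personas you actually staff."""
    return sum(PERSONAS[p]["monthly_usd"] * n for p, n in headcount.items())

print(monthly_bill({"task_worker": 120, "knowledge_worker": 40, "road_warrior": 8}))
```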

The economics of this are where it gets genuinely compelling. With the capital-intensive infrastructure now borne by the service provider, one can procure these services on a purely operating-expense basis. Each persona can be delivered at a different price point, giving businesses granular control over workplace technology costs: one pays for functions only where they are needed, instead of across the business. Clients also get the flexibility to increase and decrease headcount without being constrained by IT investments, and longer-term predictability of their expenditures.

Access to the workspace can then be provided from any supported device. Today, even high end televisions have sufficient internet browser support to run some of these applications. In-car telematics systems currently being deployed are likely the next class of applications to be able to provide access.

The long road that started with the first barely viable versions of Microsoft Windows has created an information technology universe in which one’s personal computer is far less important than one’s virtual computing environment, which can be accessed with almost anything from almost anywhere. Leaders in the user centered computing space will provide clients with tightly integrated, machine-independent, highly resilient and reliable services that focus on users and what they need to achieve. The days when technology support meant worrying about keeping a fleet of PCs humming are about as cutting edge as your rangefinder film camera.

Rights in Data

Today, we live in an almost entirely digital universe.

In that digital universe, we frequently buy digital ‘products’ which are assumed to be as immutable and permanent as any other material good which we might exchange for currency.

That assumption, based on a recent experience I had, would be completely and woefully incorrect.

I’m an unrepentant, unreformed musician.

My teenage band played CBGB during the second wave of do-it-yourself rock and roll. I’ve jammed with George Thorogood and John Mellencamp when their tours took them through Baltimore, and I’ve purchased a bottle of Hennessey Cognac – required by a contract rider – that I personally placed in the hands of Muddy Waters.

My house is filled with basses and guitars and more music than Doan’s has little back pills.

In the last 10 years, much of that music has been purchased through Amazon.

As a guy that really needed to keep his day job, the Systems Engineer portion of my personality has provided file server platforms in my house to keep digital forms of that music where the entire household can get to it. Of the shared directories on that 4TB file server, the largest one, by far, is the music share.

A few weeks ago, I did what I frequently do on payday, which is that I purchased a few CDs from Amazon.

(A tribute album for the late Tulsa Composer JJ Cale and a new Richard Thompson compilation, for those that wonder)

Amazon provides an MP3 version of physical recordings that one purchases, which saves me a few minutes in ripping new music to the server, and allows me to have immediate access to digital copies of new music that I’ve purchased.

Theoretically, it’s a nice value add.

Theoretically.

Because what happened next is one of the most extraordinary violations of customer rights I have seen in nearly 30 years in the Information Technology universe.

After I completed the purchase, I went to download the digital copies of my new ‘records’.

And that, as my British friends are wont to say, is where everything began to go horribly pear-shaped.

Formerly, when one wanted to download a new album, it was possible to execute the new download inside one’s Internet browser.

This time, I got a pop-up indicating that ‘the best experience’ was going to be obtained using Amazon’s Music Player application.

I should point out that it wasn’t ‘the best experience’, but rather the only experience, as the browser downloads were no longer functional.

“What could possibly go wrong?”, I thought, and I upgraded my existing copy of Amazon’s music player with the new player.

I launched the player and downloaded my new albums. As I would expect, the Amazon player identified all of my prior purchases and displayed them in the Amazon music player, including the library on my file server, which was visible on my PC as a persistently mapped network drive.

Then I closed the player, as I am accustomed to using Windows built-in media player to play all of my music, including MP3s obtained from other online sources and ripped from analog sources such as records and cassettes.

And that’s when something truly strange happened.

My new music would not play in Windows Media Player.

For a control, I checked a title that I listen to a lot. It wouldn’t play either.

Every single MP3 I’d ever obtained from Amazon – whether they had digitized them or I had – would no longer play in Windows Media Player.

Turning to the Internet, it was instantly clear that I was not alone.

The Amazon support forums made it pretty clear what was going on. The updated player had parsed the ID3 tags of every single recording I’d ever purchased from them, and rewritten them in a format that the Microsoft player couldn’t read.

I’ll leave it up to you to decide whether this was sloppy system engineering and testing, or a deliberate act.

Think for a second about what I’m describing here.

A piece of software that I didn’t buy traversed my network, accessed a file server using my credentials, and overwrote the metadata of digital products I did buy, in some cases as much as ten years ago. In doing so, it made them useless for the purpose for which they were originally purchased.

When you put it like that, it sounds like a crime.

That’s because it is a crime.

I’ve never seen any software license that gives a publisher the right to access and alter or destroy data that the application can reach. And if I’d seen one, nobody would sign it. This is illegal access to an information system, pure and simple. People are in federal prison for exactly this right now, today.

The users of the product on the support forum have even been able to diagnose for Amazon exactly what the nature of the problem is. They’ve used ID3 tag editors to confirm that the album art the player wrote into the ID3 tags does not fully conform to the ID3 format specification. Delete the one field, and the track plays. Write the information back with an editor that does conform to the specification, and the track plays.
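For anyone stuck in the same boat, that forum diagnosis translates into a few lines of Python using the third-party mutagen library. This assumes the malformed field really is the embedded album-art (APIC) frame, as the forum posters found, and you should back up your library before letting anything like this loose on it:

```python
# Workaround sketch using the third-party mutagen library. It assumes,
# per the forum diagnosis, that the malformed field is the embedded
# album-art (APIC) frame. Back up your library before running this.
from pathlib import Path

from mutagen.id3 import ID3, ID3NoHeaderError

music_root = Path("/srv/music")   # hypothetical path to the file server share

for mp3 in music_root.rglob("*.mp3"):
    try:
        tags = ID3(str(mp3))
    except ID3NoHeaderError:
        continue                   # no ID3 tag at all; nothing to fix
    if tags.getall("APIC"):
        tags.delall("APIC")        # drop the nonconforming album art
        tags.save()                # rewrite the tag to the spec
        print(f"stripped album art: {mp3}")
```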

Trying to make Amazon fix this has not yet been successful.

After much effort, I’ve been able to speak to the Product Manager for the Amazon media player product. He admits fault while denying ill intent.

This is in no way surprising.

He offered to refund all the money I’ve ever spent on Amazon music, in order to make it up to me. His minions then refunded only the money I’ve spent on Amazon digital music, which is only about 2 percent of my total expenditure, and then refunded it to a credit card that is closed.

That really made me feel better.

I remain hopeful that the Amazon team will really fix this, but the 8 months’ worth of customer narratives on the support forums tell a different story.

So what I am faced with is a company that I’ve done really substantial business with over many years that has willfully accessed products I’ve purchased and altered them without my permission so that I can no longer use them. Whether they’ve done this to force customers to use their products or to injure their competitors doesn’t really matter.

Until they make good on this, it bespeaks an organization with an extraordinarily shocking lack of ethics and integrity, and a sheer ruthlessness of a magnitude that I’ve never seen before in my many years in the Information Technology Business.

And although Amazon is a huge company, it really gives one pause as to whether it’s smart to run business applications or store critical information on their cloud infrastructure. If they will willingly run roughshod over and trespass on my digital property on my network, I can’t imagine they would have more scruples in the case of their own infrastructure.

Yeah, there’s an app for that….

Ok, I’d be the first to admit that my sense of irony may be too highly developed.

But I’m pretty sure that this time, it’s not me.

The legendary, foundational bluesman Howling Wolf, when speaking of his equally legendary and humble 1951 Pontiac station wagon, once said, “I own my car. It don’t own me.”

Wolf’s manifold wisdom could find application to a number of different types of possessions today, possessions where it may not be entirely clear just who belongs to whom.

I have noticed a trend in smartphone software development lately. It isn’t isolated, and it isn’t some sort of fluke one-off.

One of the fastest growing categories of smartphone applications is applications designed to manage how much time we spend using our smartphone applications.

You may elect to read that again if you believe your brain just encountered some sort of Dr. Whovian temporal discontinuity.

That’s right. Apps that control overuse of apps.

I told you my overactive sense of irony was not responsible.

The first such application that caught my eye was in the Washington Post’s tech section:

http://www.washingtonpost.com/news/technology/wp/2014/08/21/this-app-tells-you-how-much-time-you-are-spending-or-wasting-on-your-smartphone/

This application, Moment, monitors how much time you are spending using your phone. It can set limits and ‘remind’ you if you exceed the limits you previously set.

Thinking that something like Moment had to be a fluke, I started doing a few searches, and discovered that Moment was far from alone.  

http://gigaom.com/2011/06/09/19-apps-to-boost-concentration/

In this group of software applications one can find applications that shut down or silence input from other applications. There are applications that block all social media updates. There are applications that time one’s ability to focus and then grant a rest break after a certain number of ‘focused’ minutes.
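The core mechanic these apps sell is almost embarrassingly simple, which is part of the joke. A toy sketch of the ‘set a limit, get a nag’ logic:

```python
import time

# Toy version of the "set a limit, get a reminder" mechanic these apps sell.
DAILY_LIMIT_MIN = 60
usage_min = 0.0

def record_session(start: float, end: float) -> None:
    """Accumulate screen time and nag once the self-imposed limit is hit."""
    global usage_min
    usage_min += (end - start) / 60
    if usage_min > DAILY_LIMIT_MIN:
        print(f"Reminder: {usage_min:.0f} of {DAILY_LIMIT_MIN} minutes used. Put it down.")

now = time.time()
record_session(now - 45 * 60, now)    # a 45-minute scroll
record_session(now, now + 30 * 60)    # and another half hour
```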

All of which raises a question, one which will have far greater impact if you try to imagine all six-foot-three, 275 pounds of Howling Wolf looking down at you to ask it: “Do you own your phone, or does your phone own you?”

Right now, the answer, for most people, is that the phone is in charge.

The mobile phone, when it was first deployed, was supposed to be the be-all and end-all of business productivity tools. The pitch was that it allowed your customers to reach you no matter where you were, so that you could always conduct more business without the limitation of having to be in your office.

Instead, what we now have is such a ‘compelling user experience’ that a substantial number of smart phone users display the same behavioral dynamics as street junkies – compulsive inability to do anything but keep going back for another little fix. The smartphone has become the lever marked “Cheese” in the rat cage of the biggest psychology experiment ever conducted on human beings.

Today, I enjoyed my lunch outside in a large public plaza, replete with fountains, waterfalls and aquatic plants, on a summer day in Reston, Virginia that was a beautiful eleven on a ten-point scale. During that half hour, I was the only person I saw, whether seated or walking, who wasn’t completely zoned in to the screen of a smartphone or tablet.

A well-engineered smartphone is the consummate business tool. The Nokia Lumia I carry allows me to work with my corporate e-mail system, and review and comment on design, proposal and financial documents that have been generated in Microsoft’s Office software suite applications.

What most users have allowed their smartphones to become, however, is the consummate source of interruptions, distractions, and a host of stimuli that are all completely antithetical to focus and to productivity.

Your phone is supposed to work for you. If your phone or its installed software aren’t helping you work, then you are the one that needs to do a little downsizing – to make some changes in your personal organization. If you haven’t got the discipline to recognize the difference between a tool and a time wasting toy, another app is not going to help.

Riding the Front Wheel


I love thinking and writing about technology.

I love writing and thinking about motorcycles.

It is very rare, if not unprecedented, that I encounter a topic that partakes of both.

This one does though, so reboot and then fasten your chinstrap.

Motorcyclists have a phrase that indicates a lack of ability to look far enough ahead and plan appropriately. Riders, and especially riders that compete – racers – call this “Riding the Front Wheel”.

The implication is that your attention is not pushed far enough out in front of your progress – that you are watching and reacting to things that are essentially already happening, rather than watching the horizon, and smoothly plotting a course that will take you through traffic and around obstacles with a minimum of drama.

Riding the Front Wheel results in too many direction corrections, usually the wrong ones, at the wrong times, and a very ugly and inefficient way down your course.

The fundamental nature of technology and the technology service business is that it is always changing, and astute practitioners know how to navigate within the currents of that constant change.

Working with customers to meet their technological or technology service needs is seldom a simple, point-in-time exercise. Larger client requirements – whether for network switches, for storage, for outsourced support, or for consumption-based infrastructure or application services – need to be viewed within a continuum of time. Skilled sales practitioners seldom sell today what they have available for sale today; it’s just not that simple.

I’ve seen several hardware product manifestations of this. Like anything involving time, it can go one of two vectors, forward or backward. The Reverse Time notion was exemplified by a PC manufacturer that developed and sold an ‘Enterprise Variant’ of their standard PC, designed specifically not to change over time. They recognized that the nearly constant improvements in the standard product were causing havoc for the customers that deployed them over time; the constant changes to support and life cycle management requirements were making the internal IT team’s task essentially impossible. The Forward Time notion was exemplified by one storage vendor who consistently sold products that didn’t yet exist, and then went all but crazy trying to develop and ship products to land underneath the falling purchase orders.

The art and science of managing the flow of change in products and services is called product management. Product Managers look at the features and development effort involved in changing their products, and then make and manage plans to implement change in a controlled way over time.

Consider a customer that wants to extend secure line-of-business application access to any device with an internet browser, so that their employees can do their jobs from any smartphone, tablet or laptop with an Internet Protocol network address anywhere on the planet. There are elements of this that sound nearly sci-fi, and someone Riding the Front Wheel would conclude that since this service isn’t something your team delivers today, you and your company are out of the hunt. You know that your company can’t actually provide this service today, but you’ve been buying some craft brews for the guys back in the skunk works, and you know from talking to them that you will be able to deliver this service in the future.

You make a call to the Product Manager, and she confirms that the features your customer is asking for are scheduled for release in May of next year. A few qualifying conversations with the client surface additional information: it will take them at least until July of next year to make the fiscal and support preparations.
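
That qualification is really just scheduling arithmetic: the earliest you can deliver is gated by whichever comes later – the feature shipping or the customer being ready – plus however long the pilot needs to run. Here’s a toy sketch of that math; every date and duration in it is invented for illustration.

    # Toy timeline arithmetic for selling what will exist rather than what does.
    # Every date and duration here is invented for illustration.
    from datetime import date, timedelta

    feature_ga = date(2026, 5, 1)        # Product Manager's scheduled release (assumed)
    customer_ready = date(2026, 7, 1)    # client's fiscal and support prep done (assumed)
    pilot_length = timedelta(weeks=16)   # assumed pilot duration

    # The earliest you can start is gated by whichever is later:
    # the feature shipping, or the customer being ready for it.
    pilot_start = max(feature_ga, customer_ready)
    production_go_live = pilot_start + pilot_length

    print(f"Pilot can start {pilot_start}; production no earlier than {production_go_live}")

Run against these invented dates, the pilot can’t start before July, and production lands in late October – right around that Halloween toast.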

Fast cut to Halloween of next year, and you, the guys from the Skunk Works and the Product Manager are all lifting a few jars of celebratory Coconut Porter as your client successfully wraps up their service pilot and is heading into full scale production deployment.

Now don’t think for a minute that this approach is without risks of its own. We all know that time is a big ball of wibbly wobbly timey wimey stuff, and that the assumption that effect always follows cause in a completely linear fashion is bound to produce a few surprises. It’s not like a product development team has never missed a deadline, so one needs to be prepared to deal with some slippage and non-deterministic behavior.

But had you been Riding the Front Wheel – fixated on what existed at the time – you would have been found crushed under your bike, pinned to a large desert cactus. Instead you had your head up, had your eyes on the horizon, and could plan to be in the right place, ready to execute, when the time was finally right. You get to see the winner’s checkers, stand on the podium and taste the champagne.

Tell ‘Em Before They Ask…or Else!

If you ask most IT Directors what single system they support that carries with it the highest level of stress, you’ll likely get a pretty uniform response – their e-mail system.

This point was driven home like a wooden stake through the heart of a vampire last week, when Microsoft’s Cloud e-mail offering, Exchange Online – part of their Office 365 suite – decided to go tango uniform for more than 5 hours.

Most people that make the decision to move their e-mail operations to the cloud do so for one of two reasons. The statistically more likely reason is that properly implementing and operating e-mail systems involves a great deal of both capital investment and operating expense. Like it or not, e-mail is mission critical – the vast majority of business transactions today take place in, and are documented by, the organization’s e-mail system. E-mail downtime equals paralysis, pure and simple.

Designing in and then implementing the proper degree of hardware and software fault tolerance is staggeringly expensive, and employing the people that understand and can effectively manage that complex infrastructure is only slightly less so. For startups or small organizations, the necessary expenditures can be prohibitive, from both the capital investment and operating expense perspectives.

The second reason is almost a logical outgrowth of the first – uptime and availability are a function of both investment and expertise – and smaller organizations should be able to exploit that advanced expertise and those larger, leveraged investments. At least in theory, the major providers of cloud e-mail services – Microsoft, Google, et al – should be masters of that discipline. Looked at dispassionately, Microsoft should be able to design and operate a multi-site fault tolerant service that can take a direct hit, lose an entire site, and no one will ever notice.

Operative concept here is should.
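
To put some rough numbers on that should: if a single site delivers 99.9% availability – about 8.8 hours of downtime a year – and sites really did fail independently of one another, adding sites should make outages vanishingly rare. This is strictly a back-of-the-envelope sketch; the per-site figure and the independence assumption are mine, not Microsoft’s.

    # Back-of-the-envelope availability arithmetic for a multi-site service.
    # The 99.9% per-site figure is assumed, and so -- critically -- is the
    # assumption that sites fail independently of one another.
    HOURS_PER_YEAR = 24 * 365

    def combined_availability(per_site: float, sites: int) -> float:
        """Probability that at least one of `sites` independent sites is up."""
        return 1 - (1 - per_site) ** sites

    for n in (1, 2, 3):
        a = combined_availability(0.999, n)
        downtime_seconds = (1 - a) * HOURS_PER_YEAR * 3600
        print(f"{n} site(s): {100 * a:.7f}% available, "
              f"~{downtime_seconds:,.0f} seconds of downtime per year")

The trouble, as the engineering discussion below suggests, is that independence is exactly the assumption complex systems have a way of violating.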

One of my favorite writers on the subject of engineering is Kevin Cameron, who is a mechanical engineer by trade. Kevin is a large, versatile mind, and his thoughts on design principles and practice have applicability far beyond the internal combustion subjects about which he writes.

One of his recent columns examined the systems integration issues being encountered by new forms of multinational production, especially those being seen in the commercial aircraft industry. Kevin concluded that the types of systems now being designed are so staggeringly complex, and their components have such an extraordinary range of interactions with each other, that engineers no longer have the ability to predict the types of system failures that can occur. Essentially, systems have become so complex that when one component failure starts to cascade to other system components, the collapse of the system is a completely unpredicted event – one that may force large portions of the system to be redesigned to accommodate it.
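
One way to see why prediction breaks down is plain combinatorics: pairwise interactions grow quadratically with component count, and the number of possible up/down combinations grows exponentially. The counts below are a toy illustration of that growth, not anything from Kevin’s column.

    # Why failure modes stop being enumerable: pairwise interactions grow
    # quadratically with component count, and up/down state combinations
    # grow exponentially. Illustrative only.
    from math import comb

    for n in (10, 100, 1000):
        pairwise = comb(n, 2)   # possible two-component interactions
        states = 2 ** n         # combinations of components being up or down
        print(f"{n:>5} components: {pairwise:>7,} pairwise interactions, "
              f"~{states:.1e} up/down states")

At a thousand components, no design review can walk all of those states.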

Kevin might have been talking about the new Boeing Dreamliner, but he might just as well have been talking about any complex cloud infrastructure – Exchange Online and Amazon Web Services have both experienced widespread system outages that their architects would likely have told you were impossible before they occurred.

Everything built by the hand of man can and will break. The odds of such failures – if proper precautions have been taken – may be infinitesimally low, but they will occur.

Good IT service providers are fully prepared to communicate definitive information to their customers when stuff under their control, and on which their customers depend, inevitably breaks.

My employer’s IT Service contracts define extremely specific notification procedures and methods to be used in the event of a failure with the potential to impact the customer’s business. Those commitments list specific individuals to be contacted, the times they will be provided with initial notification and updates, and the specific communications methods that will be used to make those notifications.
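
Commitments that specific are concrete enough to sketch in code. What follows is a purely hypothetical rendering of such a schedule – the names, channels and intervals are invented, not lifted from any actual contract.

    # A hypothetical rendering of a contractual notification schedule:
    # who gets told, when, over what channel. Names, intervals and
    # channels are invented for illustration.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class NotificationStep:
        contact: str                  # specific named individual or role
        channel: str                  # contractually specified method
        minutes_after_detection: int  # deadline for initial notification

    @dataclass
    class EscalationPolicy:
        severity: str
        steps: List[NotificationStep]
        update_interval_minutes: int = 30  # cadence for ongoing updates

    outage_policy = EscalationPolicy(
        severity="business-impacting outage",
        steps=[
            NotificationStep("customer IT director", "phone call", 15),
            NotificationStep("customer operations distribution list", "SMS", 15),
            NotificationStep("customer executive sponsor", "phone call", 60),
        ],
    )

The point isn’t the data structure – it’s that the commitment is specific enough to be mechanized, audited and drilled.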

When people entrust you with their business, the most important thing they want to know is that you, as an IT Service Provider, understand the disruption and problems that a service outage is causing. They want to know that you are aware of the scope of a problem, that you have a plan to address it and are working that plan, and your best estimates as to when service is going to be restored.

Contrast that best practices approach with what occurred during the Microsoft Exchange Online outage. Astoundingly, with untold tens of thousands of folks depending on the service, Microsoft didn’t appear to have an out-of-band method already defined to proactively communicate with their users.

“I know… we’ll just send out an e-mail… oh… waitasecond… um…”

When they did finally determine a method to communicate with their customers – several hours into the event – we were treated to the unseemly spectacle of Microsoft resorting to Twitter… one of the biggest technology companies in the world using someone else’s application to get the word out. This event, and the way Microsoft shared information with their mission-critical customers, made them look unprepared and, frankly, like rank amateurs.

Everyone makes mistakes. It’s how one responds to those mistakes that demonstrates the quality and character of an organization.

If you go to the market to purchase any sort of IT Service, you should rightfully expect that your service provider will have failsafe notification methods already defined for any issue that has the potential to impact your business. You should also expect that those methods and escalation paths are documented and made part of your agreement with them.

A service provider that doesn’t think proactive communication around outages and return to service is a critical part of what they should be doing for you is either guilty of extreme hubris, delusional, or maybe both.