CAL: The Controller

When we interact with the world around us, we influence it, we leave ripples, a small legacy.
For almost 30 years, the biggest innovations in computing were driven primarily by the power of new machines. Only very recently has innovation focused on new ways to interact with these devices. Every connected computer interaction leaves a ripple. As with all chaos and the systems that envelop it, a single ripple or millions of concentrated ripples can change the future, can cause feedback loops, and can propagate into ever new systems that take us to unimagined worlds. The interface that enables us to communicate with computers is evolving into an immersive life experience that adds value to both CAL (Computer Aided Life) and noCAL.
It Started with Touch
There’s something special about touch. It’s intuitive, it’s primal, and it’s often the very first method we use to explore and manipulate the world around us. When Nintendo released their touch-screen-based DS system in 2004, they brought tactile controls to the masses. Since then, the trend has been toward more and more direct control of our devices.

This kind of interface goes so much further than the classic keyboard and mouse. Static buttons gave way to gestures, pressure and context-sensitive inputs. Instead of operating through a layer of abstraction (the mouse pointer), we gained direct control.
Direct interfaces revolutionized how, and how much, we collectively use our phones, computers, and games. They made things simpler, bringing us one step closer to life aided by computers.
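To make that shift concrete, here is a minimal sketch of direct input built on the web Pointer Events API. It is illustrative only: the "surface" element id and the pinch logging are placeholders, not taken from any particular product.

```typescript
// Minimal sketch: reading direct touch input (position, pressure, multi-touch)
// with the web Pointer Events API. The "surface" element id is a placeholder.
const surface = document.getElementById("surface") as HTMLElement;

// Track every active contact so simple gestures, like a pinch, can be recognized.
const activePointers = new Map<number, { x: number; y: number }>();

surface.addEventListener("pointerdown", (e: PointerEvent) => {
  activePointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
});

surface.addEventListener("pointermove", (e: PointerEvent) => {
  if (!activePointers.has(e.pointerId)) return;
  activePointers.set(e.pointerId, { x: e.clientX, y: e.clientY });

  // Pressure arrives straight from the hardware as a 0..1 value, something a
  // mouse pointer never exposed (hardware without a pressure sensor typically
  // reports 0.5 while in contact).
  console.log(`contact ${e.pointerId} pressure=${e.pressure.toFixed(2)}`);

  // Two simultaneous contacts: measure the distance between them to drive a pinch.
  if (activePointers.size === 2) {
    const [a, b] = [...activePointers.values()];
    console.log(`pinch spread: ${Math.hypot(a.x - b.x, a.y - b.y).toFixed(0)}px`);
  }
});

surface.addEventListener("pointerup", (e: PointerEvent) => {
  activePointers.delete(e.pointerId);
});
```

A mouse driver reduces all of this to a single synthetic pointer; a touch surface hands the application every finger, each with its own position and pressure.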
Bust a Move
Almost simultaneously, within the interactive media industry, we saw the sudden rise of motion controls. Again, innovation was pushed by Nintendo with their Wii console. They introduced three-dimensional gestures and pointers. Players could use the buttons and sticks they were familiar with while gaining a new dimension of control.
Microsoft and Sony countered with the Kinect and the Move respectively, each representing a subtly different take on the core concept. Microsoft combined cameras, microphones and infrared to create complex 3D maps of the environment. This very same technology has since been used by the medical and robotics communities as an extremely cheap and effective way to tackle incredibly complex problems.
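As a rough illustration of the underlying idea, the sketch below back-projects a depth image, the kind of frame a Kinect-style sensor produces, into a 3D point cloud using a pinhole camera model. The focal lengths and principal point are assumed placeholder values, not the device’s actual calibration.

```typescript
// Minimal sketch: turning a depth frame (millimetres per pixel) into a 3D point
// cloud with a pinhole camera model, the core step behind Kinect-style
// environment mapping. The intrinsics below are assumed values, not a real
// sensor's calibration.
interface Point3D { x: number; y: number; z: number; }

const fx = 525, fy = 525;     // focal lengths in pixels (assumption)
const cx = 319.5, cy = 239.5; // principal point for a 640x480 frame (assumption)

function depthToPointCloud(depthMm: Uint16Array, width: number, height: number): Point3D[] {
  const points: Point3D[] = [];
  for (let v = 0; v < height; v++) {
    for (let u = 0; u < width; u++) {
      const z = depthMm[v * width + u] / 1000; // depth in metres
      if (z === 0) continue;                   // zero means "no reading" at this pixel
      // Back-project pixel (u, v) through the pinhole model into camera space.
      points.push({ x: ((u - cx) * z) / fx, y: ((v - cy) * z) / fy, z });
    }
  }
  return points;
}

// Tiny usage example: a flat wall one metre away filling a 4x4 depth frame.
const frame = new Uint16Array(16).fill(1000);
console.log(depthToPointCloud(frame, 4, 4).length); // 16 points, all at z = 1
```

Stream those point clouds frame after frame and you have the raw material that the robotics and medical communities have been building on.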
These technologies are being used to create new and engaging stories and experiences that move beyond what we’ve come to accept in our daily lives.
Far From Over
Nintendo’s latest console, due out later this year, continues to push the boundaries of interactivity. The Wii U will use a touch-screen tablet controller as well as the motion controls from the previous iteration. Nintendo has touted this control mechanism as a way to achieve asymmetric multiplayer, allowing several people to interact with the console in very different ways at the same time. Again, it broadens the spectrum of how we think about and understand our lives and our relationships with the technology that inhabits them.
Valve co-founder Gabe Newell has dropped some hints as to what we might be seeing in the near future as well. In a recent interview, he briefly mentioned a wearable computing system that automatically overlays information based on what the user’s head and eyes are doing, and that responds to subtler gestures than commercially available technology can currently detect.
Google, too, is jumping into the market of wearable, fully interactive technology with Google Glass. Glass can dynamically display critical information, stats, email, location, and the like, all from a small head-mounted piece that keeps everything just outside of the user’s center of vision.
All of these technologies are bringing us that much closer to the kind of world authors and filmmakers have dreamed of for decades. One company in particular is pushing the limits of what we can do with technology.
Oblong Industries has what is quite probably the most advanced I/O system currently available. Inspired by the spatial operating environment of Minority Report, it allows multiple users operating in synchrony to manipulate and control complex information in dynamic, non-linear ways. 
The Future is Coming
People have been speculating about “the future” for as long as people have been thinking about things. It might sound cliché, but we truly are on the verge of something special. For years we were bound to mechanical interfaces that were non-intuitive and needed extensive training to use properly and efficiently. Now we have the technology, in commercially available form, to step away from that archaic past and begin exploring the possibilities of Computer Aided Life – CAL.
We have, in many ways, developed relationships with technology as an experience. Anything with a computer chip or computer code (laptops, desktops, consoles, portable devices, refrigerators, elevators, cars, pacemakers, televisions…) is merging into what we call CAL, Computer Aided Life.
It is tactile. It drives sound and sight and touch. It delivers the experience of motion, emotion, interaction and expression. It is becoming more personalized as it simultaneously becomes more depersonalized. It sounds odd, but it’s true for many in this generation. We rely on CAL to keep us connected, to monitor our objectives and data, to bring us art and to help us explore our world.
LowCAL and noCAL are part of the rich fabric of our life experience. The balance, the equilibrium, the value among CAL, lowCAL and noCAL is for each of us to choose. But that independence of choice is difficult today and will become more difficult in the future.
There are three categories of organizations that control CAL. There are content providers such as gaming companies, publishers, multimedia accumulators, newspapers, social networks, blogs and web sites galore. There are the vertically integrated (hardware, software, content) networks such as Apple, Microsoft and Google. And there are the high bandwidth communications utilities such as cable/fiber providers, copper providers and satellite providers.  
The vertically integrated networks and the high bandwidth utilities do not seem interested in Nurture, Equality, Truth or Systems as part of their integral agenda. Content providers are more NETS compatible – theirs is a rich spectrum of offerings that mitigates threats of control and resultant dependence. The obvious dangers of massive clouds operated by a small number of profit-driven corporations concern antitrust, free enterprise and freedom of choice. If all of us must use tools and channels of communication dominated and controlled by a small number of organizations with an agenda other than NETS, then we must be eternally cautious. We should find open, NETS-based paths to Computer Aided Life.
Google’s interactive experiment, Exquisite Forest, is a pioneering step toward browser-based interactive Computer Aided Life. It delivers real-time engagement, real-time expression, real-time conversations across sound and sight and text and form. Imagine a universal CAL interface that is touch-based, large-screen, real-time, hyper-powered with toolsets for communication and creation. It will become our new playground and workground. That is where we are headed.
The potential benefits of one degree of separation from CAL are amazing. Direct contact with image, with expression, with interaction, with mass participation, with systems, with engines of creation will take us toward complexities that touch our minds more than our hands.
It’s all leading to a different kind of future, one that will allow us to dynamically collaborate and create complex models of knowledge. This new future will need a new semiotic language, and Y Worlds, built with that understanding and that goal, is set to create and fuel that journey.
