"Tablet Evolution", Presented by #Motorola {VIDEO}

The Tablets Are Coming...And Perhaps Honeycomb Will Change The Game.

Let's Hope...

Thursday, September 9, 2010

{CHART} Facebook Surpasses Google In Time Spent On Site For First Time Ever; Yahoo Trending Downward...

From BusinessInsider.com:

" If Google wasn't already scared of Facebook, this ought to do the trick.
Time spent on Facebook was greater than time spent on Google sites in the U.S. in August for the first time in history, according to fresh data from comScore.

Meanwhile, Yahoo continues its slide from the top of the heap to the bottom. "

Wednesday, July 28, 2010

" Remember That A Person's Name Is To That Person The Sweetest And Most Important Sound In Any Language. " ~ Dale Carnegie http://bit.ly/dCG25K

Thursday, July 22, 2010

" Your Vision Will Become Clear Only When You Can Look Into Your Own Heart. Who Looks Outside, Dreams. Who Looks Inside, Awakens. " ~ Carl Jung http://bit.ly/9rMVjt

Tuesday, July 20, 2010

'Ride The Lightning'...Spectacular #Lightning Over #Athens, #Greece {IMAGE}

{ Special Thanks to The Dude Dean... }

Originally from:  http://apod.nasa.gov/apod/ap100720.html 

Lightning Over Athens 
Credit & Copyright: Chris Kotsiopoulos

Explanation: Have you ever watched a lightning storm in awe? Join the crowd. Oddly, nobody knows exactly how lightning is produced. What is known is that charges slowly separate in some clouds, causing rapid electrical discharges (lightning), but how electrical charges get separated in clouds remains a topic of much research. Lightning usually takes a jagged course, rapidly heating a thin column of air to about three times the surface temperature of the Sun. The resulting shockwave starts supersonically and decays into the loud sound known as thunder. Lightning bolts are common in clouds during rainstorms, and on average 6,000 lightning bolts occur between clouds and the Earth every minute. Pictured above, an active lightning storm was recorded over Athens, Greece, earlier this month.

Monday, July 19, 2010

" Social Media Only Becomes A Priority When You Understand The Multiplicity Of Benefits From It To You... " ~ @michaelgass http://bit.ly/9oYLBd

Sunday, July 18, 2010

"Things Are Seldom What They Seem; Skim Milk Masquerades As Cream." ~ William S. Gilbert :^)~ (thx Joe Stirt!) http://bit.ly/bFPLu3

Sunday, June 6, 2010

"Does The Internet Make You Smarter / Dumber ?" By Clay Shirky / Nicholas Carr

Published by http://online.wsj.com on Saturday, June 5, 2010, as The Saturday Essay

By Clay Shirky { 'Smarter' }
By Nicholas Carr { 'Dumber' }

Images by:  Charis Tsevis (Shirky); Mick Coulas (Head in Front of Keyboard)

'Smarter' { by Clay Shirky }

Digital media have made creating and disseminating text, sound, and images cheap, easy and global. The bulk of publicly available media is now created by people who understand little of the professional standards and practices for media.

Instead, these amateurs produce endless streams of mediocrity, eroding cultural norms about quality and acceptability, and leading to increasingly alarmed predictions of incipient chaos and intellectual collapse.

But of course, that's what always happens. Every increase in freedom to create or consume media, from paperback books to YouTube, alarms people accustomed to the restrictions of the old system, convincing them that the new media will make young people stupid. This fear dates back to at least the invention of movable type.

As Gutenberg's press spread through Europe, the Bible was translated into local languages, enabling direct encounters with the text; this was accompanied by a flood of contemporary literature, most of it mediocre. Vulgar versions of the Bible and distracting secular writings fueled religious unrest and civic confusion, leading to claims that the printing press, if not controlled, would lead to chaos and the dismemberment of European intellectual life.

These claims were, of course, correct. Print fueled the Protestant Reformation, which did indeed destroy the Church's pan-European hold on intellectual life. What the 16th-century foes of print didn't imagine—couldn't imagine—was what followed: We built new norms around newly abundant and contemporary literature. Novels, newspapers, scientific journals, the separation of fiction and non-fiction, all of these innovations were created during the collapse of the scribal system, and all had the effect of increasing, rather than decreasing, the intellectual range and output of society.

To take a famous example, the essential insight of the scientific revolution was peer review, the idea that science was a collaborative effort that included the feedback and participation of others. Peer review was a cultural institution that took the printing press for granted as a means of distributing research quickly and widely, but added the kind of cultural constraints that made it valuable.

We are living through a similar explosion of publishing capability today, where digital media link over a billion people into the same network. This linking together in turn lets us tap our cognitive surplus, the trillion hours a year of free time the educated population of the planet has to spend doing things they care about. In the 20th century, the bulk of that time was spent watching television, but our cognitive surplus is so enormous that diverting even a tiny fraction of time from consumption to participation can create enormous positive effects.

Wikipedia took the idea of peer review and applied it to volunteers on a global scale, becoming the most important English reference work in less than 10 years. Yet the cumulative time devoted to creating Wikipedia, something like 100 million hours of human thought, is expended by Americans every weekend, just watching ads. It only takes a fractional shift in the direction of participation to create remarkable new educational resources.

Similarly, open source software, created without managerial control of the workers or ownership of the product, has been critical to the spread of the Web. Searches for everything from supernovae to prime numbers now happen as giant, distributed efforts. Ushahidi, the Kenyan crisis mapping tool invented in 2008, now aggregates citizen reports about crises the world over. PatientsLikeMe, a website designed to accelerate medical research by getting patients to publicly share their health information, has assembled a larger group of sufferers of Lou Gehrig's disease than any pharmaceutical agency in history, by appealing to the shared sense of seeking medical progress.

Of course, not everything people care about is a high-minded project. Whenever media become more abundant, average quality falls quickly, while new institutional models for quality arise slowly. Today we have The World's Funniest Home Videos running 24/7 on YouTube, while the potentially world-changing uses of cognitive surplus are still early and special cases.

That always happens too. In the history of print, we got erotic novels 100 years before we got scientific journals, and complaints about distraction have been rampant; no less a beneficiary of the printing press than Martin Luther complained, "The multitude of books is a great evil. There is no measure or limit to this fever for writing." Edgar Allan Poe, writing during another surge in publishing, concluded, "The enormous multiplication of books in every branch of knowledge is one of the greatest evils of this age; since it presents one of the most serious obstacles to the acquisition of correct information."

The response to distraction, then as now, was social structure. Reading is an unnatural act; we are no more evolved to read books than we are to use computers. Literate societies become literate by investing extraordinary resources, every year, training children to read. Now it's our turn to figure out what response we need to shape our use of digital tools.

The case for digitally-driven stupidity assumes we'll fail to integrate digital freedoms into society as well as we integrated literacy. This assumption in turn rests on three beliefs: that the recent past was a glorious and irreplaceable high-water mark of intellectual attainment; that the present is only characterized by the silly stuff and not by the noble experiments; and that this generation of young people will fail to invent cultural norms that do for the Internet's abundance what the intellectuals of the 17th century did for print culture. There are likewise three reasons to think that the Internet will fuel the intellectual achievements of 21st-century society.

First, the rosy past of the pessimists was not, on closer examination, so rosy. The decade the pessimists want to return us to is the 1980s, the last period before society had any significant digital freedoms. Despite frequent genuflection to European novels, we actually spent a lot more time watching "Diff'rent Strokes" than reading Proust, prior to the Internet's spread. The Net, in fact, restores reading and writing as central activities in our culture.

The present is, as noted, characterized by lots of throwaway cultural artifacts, but the nice thing about throwaway material is that it gets thrown away. The issue isn't whether there's lots of dumb stuff online—there is, just as there is lots of dumb stuff in bookstores. The issue is whether there are any ideas so good today that they will survive into the future. Several early uses of our cognitive surplus, like open source software, look like they will pass that test.

The past was not as golden, nor is the present as tawdry, as the pessimists suggest, but the only thing really worth arguing about is the future. It is our misfortune, as a historical generation, to live through the largest expansion in expressive capability in human history, a misfortune because abundance breaks more things than scarcity. We are now witnessing the rapid stress of older institutions accompanied by the slow and fitful development of cultural alternatives. Just as required education was a response to print, using the Internet well will require new cultural institutions as well, not just new technologies.

It is tempting to want PatientsLikeMe without the dumb videos, just as we might want scientific journals without the erotic novels, but that's not how media works. Increased freedom to create means increased freedom to create throwaway material, as well as freedom to indulge in the experimentation that eventually makes the good new stuff possible. There is no easy way to get through a media revolution of this magnitude; the task before us now is to experiment with new ways of using a medium that is social, ubiquitous and cheap, a medium that changes the landscape by distributing freedom of the press and freedom of assembly as widely as freedom of speech.

'Dumber' { by Nicholas Carr }

The Roman philosopher Seneca may have put it best 2,000 years ago: "To be everywhere is to be nowhere." Today, the Internet grants us easy access to unprecedented amounts of information. But a growing body of scientific evidence suggests that the Net, with its constant distractions and interruptions, is also turning us into scattered and superficial thinkers.

The picture emerging from the research is deeply troubling, at least to anyone who values the depth, rather than just the velocity, of human thought. People who read text studded with links, the studies show, comprehend less than those who read traditional linear text. People who watch busy multimedia presentations remember less than those who take in information in a more sedate and focused manner. People who are continually distracted by emails, alerts and other messages understand less than those who are able to concentrate. And people who juggle many tasks are less creative and less productive than those who do one thing at a time.

The common thread in these disabilities is the division of attention. The richness of our thoughts, our memories and even our personalities hinges on our ability to focus the mind and sustain concentration. Only when we pay deep attention to a new piece of information are we able to associate it "meaningfully and systematically with knowledge already well established in memory," writes the Nobel Prize-winning neuroscientist Eric Kandel. Such associations are essential to mastering complex concepts.

When we're constantly distracted and interrupted, as we tend to be online, our brains are unable to forge the strong and expansive neural connections that give depth and distinctiveness to our thinking. We become mere signal-processing units, quickly shepherding disjointed bits of information into and then out of short-term memory.

In an article published in Science last year, Patricia Greenfield, a leading developmental psychologist, reviewed dozens of studies on how different media technologies influence our cognitive abilities. Some of the studies indicated that certain computer tasks, like playing video games, can enhance "visual literacy skills," increasing the speed at which people can shift their focus among icons and other images on screens. Other studies, however, found that such rapid shifts in focus, even if performed adeptly, result in less rigorous and "more automatic" thinking.

In one experiment conducted at Cornell University, for example, half a class of students was allowed to use Internet-connected laptops during a lecture, while the other half had to keep their computers shut. Those who browsed the Web performed much worse on a subsequent test of how well they retained the lecture's content. While it's hardly surprising that Web surfing would distract students, it should be a note of caution to schools that are wiring their classrooms in hopes of improving learning.

Ms. Greenfield concluded that "every medium develops some cognitive skills at the expense of others." Our growing use of screen-based media, she said, has strengthened visual-spatial intelligence, which can improve the ability to do jobs that involve keeping track of lots of simultaneous signals, like air traffic control. But that has been accompanied by "new weaknesses in higher-order cognitive processes," including "abstract vocabulary, mindfulness, reflection, inductive problem solving, critical thinking, and imagination." We're becoming, in a word, shallower.

In another experiment, recently conducted at Stanford University's Communication Between Humans and Interactive Media Lab, a team of researchers gave various cognitive tests to 49 people who do a lot of media multitasking and 52 people who multitask much less frequently. The heavy multitaskers performed poorly on all the tests. They were more easily distracted, had less control over their attention, and were much less able to distinguish important information from trivia.

The researchers were surprised by the results. They had expected that the intensive multitaskers would have gained some unique mental advantages from all their on-screen juggling. But that wasn't the case. In fact, the heavy multitaskers weren't even good at multitasking. They were considerably less adept at switching between tasks than the more infrequent multitaskers. "Everything distracts them," observed Clifford Nass, the professor who heads the Stanford lab.

It would be one thing if the ill effects went away as soon as we turned off our computers and cellphones. But they don't. The cellular structure of the human brain, scientists have discovered, adapts readily to the tools we use, including those for finding, storing and sharing information. By changing our habits of mind, each new technology strengthens certain neural pathways and weakens others. The cellular alterations continue to shape the way we think even when we're not using the technology.

The pioneering neuroscientist Michael Merzenich believes our brains are being "massively remodeled" by our ever-intensifying use of the Web and related media. In the 1970s and 1980s, Mr. Merzenich, now a professor emeritus at the University of California in San Francisco, conducted a famous series of experiments on primate brains that revealed how extensively and quickly neural circuits change in response to experience. When, for example, Mr. Merzenich rearranged the nerves in a monkey's hand, the nerve cells in the animal's sensory cortex quickly reorganized themselves to create a new "mental map" of the hand. In a conversation late last year, he said that he was profoundly worried about the cognitive consequences of the constant distractions and interruptions the Internet bombards us with. The long-term effect on the quality of our intellectual lives, he said, could be "deadly."

What we seem to be sacrificing in all our surfing and searching is our capacity to engage in the quieter, attentive modes of thought that underpin contemplation, reflection and introspection. The Web never encourages us to slow down. It keeps us in a state of perpetual mental locomotion.

It is revealing, and distressing, to compare the cognitive effects of the Internet with those of an earlier information technology, the printed book. Whereas the Internet scatters our attention, the book focuses it. Unlike the screen, the page promotes contemplativeness.

Reading a long sequence of pages helps us develop a rare kind of mental discipline. The innate bias of the human brain, after all, is to be distracted. Our predisposition is to be aware of as much of what's going on around us as possible. Our fast-paced, reflexive shifts in focus were once crucial to our survival. They reduced the odds that a predator would take us by surprise or that we'd overlook a nearby source of food.

To read a book is to practice an unnatural process of thought. It requires us to place ourselves at what T. S. Eliot, in his poem "Four Quartets," called "the still point of the turning world." We have to forge or strengthen the neural links needed to counter our instinctive distractedness, thereby gaining greater control over our attention and our mind.

It is this control, this mental discipline, that we are at risk of losing as we spend ever more time scanning and skimming online. If the slow progression of words across printed pages damped our craving to be inundated by mental stimulation, the Internet indulges it. It returns us to our native state of distractedness, while presenting us with far more distractions than our ancestors ever had to contend with.

Saturday, May 29, 2010

Good Bye, Easy Rider. #DennisHopper

From http://npr.org by Jesse Baker

The much-loved American filmmaker and actor Dennis Hopper died Saturday at his home in Venice, Calif., seven months after his manager announced that he had been diagnosed with prostate cancer. He was 74.

Early in his career, Hopper shared the screen with the likes of James Dean in 1955's Rebel Without a Cause and Elizabeth Taylor in the 1956 epic Giant; later he worked with Paul Newman in Cool Hand Luke and John Wayne in the 1969 Western True Grit. And though he started out a long way from Hollywood — in Dodge City, Kan., where he was born in 1936 — metaphorically the movies were always with him.

"I was raised at the end of the Dust Bowl, and I used to tell people the first light I saw was not from the sun but from the light of a movie projector," Hopper told Fresh Air host Terry Gross in a 1996 interview.

Hopper's directorial debut came in 1969, when fellow actor Peter Fonda came to him with an idea for a film.

"You direct, I'll produce, and we'll both ride and act in it," Fonda recalled telling Hopper. "You've got the passion, you understand framing. You go for it!"

Set in the wide-open spaces of the American Southwest, Easy Rider was about two freewheelers who ride their motorcycles from Los Angeles to New Orleans. It was all drugs and rock 'n' roll — and it made for a box-office hit. Hopper was intoxicated by the freedom that came with putting together a low-budget, self-made movie, and his directorial debut became a trailblazer for independent films in the 1970s.

In the wake of Easy Rider, and the best-screenplay Oscar nomination that came with it, Hollywood thought it had found its new golden-boy director. Money for Hopper's next project, The Last Movie, came rolling in — but the project didn't quite live up to expectations. In fact, it was a total failure: Addiction plagued Hopper during filming and post-production, says New York University film-studies professor Robert Sklar, and he soon lost himself in the editing of the picture.

"But it shaped his career, in a way," Sklar says. "He went from the top to the bottom in about the space of two years, and he spent a lot of time trying to come back."

Indeed, much of Hopper's story as an actor involves his trying to clean up, clear up and make a comeback.

"It's too easy to justify using drugs and drinking because you're an artist," he told Fresh Air's Terry Gross, in a conversation about his battles with addiction. "I can't cop to that excuse."

But Hopper did go on to enjoy a number of comebacks, and not just in film. Indeed, throughout his life, Hopper defined himself as more than just an actor. As a child, he took art lessons from the painter Thomas Hart Benton; he went on to make friends with art-world titans like Jasper Johns, Andy Warhol and David Hockney, and to become a serious painter, sculptor and collector himself.

At James Dean's urging, he'd taken up photography in his teens, and with his camera he documented everything from Berkeley hippie love-ins to the 1963 March on Washington. In his later years, he transformed vintage photos of his friends and colleagues — Paul Newman, Bill Cosby, pop artists and politicians — into billboard-size oil-on-vinyl paintings. Making pictures was a real passion for Hopper; when he played a crazed photojournalist in Francis Ford Coppola's Apocalypse Now, Peter Fonda noted that the character was remarkably similar to Hopper's real-life persona.

Countless other film projects followed, including The Texas Chainsaw Massacre 2 (1986), Blue Velvet (1986), the notoriously expensive sci-fi flop Waterworld (1995) and the beloved sports drama Hoosiers (1986) — which won him another Oscar nomination, this time for best supporting actor.

"If we go back and look at his career, there are lots of interesting discoveries to be found," says NYU's Robert Sklar. "He isn't only Easy Rider, he isn't only Apocalypse Now, he isn't only Blue Velvet — there is so much more to think about."

'STEP OFF, I'M DOING THE HUMP!!' Celebrating 20 Years of "The Humpty Dance" {VIDEO} #HumptyDance

From Rolling Stone, http://rollingstone.com by Rob Sheffield

" He got stupid. He shot an arrow like Cupid. He used words that didn’t mean nothing, like "loopid." Humpty Hump was his name and he single-handedly saved the summer of 1990, easily the worst radio summer of all time. Let us now celebrate 20 years of "The Humpty Dance," summer jam of summer jams, the song that has kept a grateful nation getting busy in Burger King bathrooms ever since.

Sometimes a summer song fights its way out of a crowded pack. Like, what was the summer song of 1984 or 2003? You might say "When Doves Cry" or "Crazy In Love," while someone else would go for "Ghostbusters" or "Seven Nation Army," maybe even "Missing You" or "Ignition (Remix)." But any fan can agree these were jam-packed summers for pop radio. Other years, there’s a clear-cut champ — no matter what else you cranked on the beach in 1994, there was only one "Gin and Juice," and despite all the great tunes kicking around in 2006, Gnarls Barkley’s "Crazy" was first among equals. But 1990 was a dismal little sweatbox — you’d have to reach back to the pre-Beatles era to search for a radio summer that weak. There was only one song everyone could agree on, one song you could blast at a party without driving everyone out to the porch. And that song was "The Humpty Dance."

Digital Underground had already released a fantastic hip-hop twelve-inch in the late summer of ’89, "Doowutchalike," which built up huge anticipation for their debut album, Sex Packets. Humpty Hump had a cameo in that tune, saying, "Homegirls, for once, forget you got class! See a guy you like, just grab him in the biscuits!" When the Bay Area crew finally dropped Sex Packets that winter, everyone seemed to feel hugely disappointed and forgot about it. But a few months later, "The Humpty Dance" began showing up on MTV and the radio, and blew everything else away. That soles-of-your-feet bassline. That "Do me baby!" hook. That tone-deaf male chorus, that irresistible "let’s get stupid" gear change at the end, all those cornball jokes about how he likes his oatmeal (lumpy) and his beats (funky). Humpty had the whole song to himself, crowing, "Both how I’m living and my nose is large!"

There were a few other worthy hip-hop singles that summer — A Tribe Called Quest’s "Can I Kick It," YZ’s "Tower With The Power," Roxanne Shante’s "Brothers Ain’t Shit" — but none of them came close to going pop. And Top 40 radio was in sorry-ass shape — this was a time when Nelson could look like a sign of life. MC Hammer’s "U Can’t Touch This" still lingered on from the spring. Wilson Phillips and En Vogue both scored horrific hits called "Hold On." Jon Bon Jovi tried to get serious with "Blaze of Glory." There were niche hits that people loved to argue about: Deee-Lite’s "Groove Is In The Heart" (which I loved), Snap’s "The Power" (which I hated), Madonna’s "Hanky Panky" (which has to be one of the strangest Top 10 hits in history). But it was "The Humpty Dance" that kept us sane enough to keep the arguments going. All summer long, I went to house parties where they played it six times. At one of these parties I tried spinning Public Enemy’s "911 Is A Joke," but it instantly cleared all females off the dance floor, pissing off my housemates yet giving us all a new appreciation for Humpty Hump.

Humpty seemed to go hand in hand with the "Black Bart" t-shirts that were equally ubiquitous that summer. The Simpsons had only been on the air for a few months, yet everywhere you went, you saw bootleg Black Bart shirts. (The other big t-shirt that summer? Washington D.C. mayor Marion Barry had just gotten busted smoking crack on camera, so if you lived in D.C., Virginia or the Carolinas, you saw a lot of "THE BITCH SET BARRY UP" shirts.) Like Bart, Humpty became part of the culture, an Eshu-Elegba trickster god in Groucho drag, the stuff of legend. He made occasional reappearances in Digital Underground’s music — he got married in "Tie the Knot," and defended his nose as an Afrocentric political statement in "No Nose Job." But we all owe him for "The Humpty Dance." Thank you, Humpty Hump. All over America, Burger King employees are still mopping up after you. "

Thursday, May 27, 2010

Adidas + Technology = 2010 Fifa World Cup #WorldCup #Soccer #Futbol #Adidas

By Linda Tischler, from http://fastcompany.com

" Other manufacturers may produce colorful balls for next month's FIFA World Cup, but there's only one official ball, and for the 11th year, Adidas earned the right to field its version of what a world-class ball should look like.

This tournament's ball, called "Jabulani," which means "to celebrate" in Zulu (a Bantu language), represents advances in both design and innovation.

Rather than being made of leather, which is traditional, the Jabulani ball is constructed of synthetics. Instead of 14 panels, there are only eight, which are held together by thermal bonding, not hand stitching.

That "grip 'n' groove" technology makes for improved wind channeling and, thus, a truer flight, Adidas officials say. Fewer seams also translate into a greater striking surface, making the Jabulani the roundest and most accurate soccer ball ever created.

As a result, this ball is faster than ever -- potentially making for higher-scoring games. That's a plus for markets, like the U.S., where less soccer-savvy audiences are less appreciative of a sophisticated defense than of the primal thrill of a boot into the net.

But it's likely to lead to frustrated goalies, who have already started to whine about it. Kasey Keller, a U.S. goalie, told the Wall St. Journal that the ball is too unpredictable; he thinks the sport should just decide on a ball and forget futzing around with innovation. Take away that man's iPad!

Apart from the technical finesse this ball represents, its design was also conceived to pay homage to the African continent's first crack at hosting the games.

Following the 2006 World Cup in Germany, Adidas dispatched designers to Africa to begin gathering data for the 2010 redesign. "Designers showed us videos from their trips," says Antonio Zea, director of soccer for Adidas America. "They had pictures of fans who create these hard hats decorated with dioramas about their teams that expressed their passion for the game."

One of the factors influencing the ball's design was South Africa's diversity -- its various climates, tribes, and languages. The number 11 turns out to have been seminal: "There are 11 players on a soccer team, 11 distinct tribes in South Africa, 11 languages spoken, and this is our 11th time to furnish the World Cup ball," Zea says. To honor all that, Adidas used 11 colors on the ball and a graphic image that represents the Soccer City Stadium in Johannesburg.

Adidas will also outfit 12 federations at the World Cup, more than any other brand. In addition to South Africa, they'll include Mexico, Germany, Argentina, Japan, Spain, France, Nigeria, Paraguay, Denmark, Slovakia, and even Greece.

Adidas jerseys will feature the company's high-performance compression TECHFIT technology, in which various bands around the shirt improve speed, power, endurance, and vertical jumping ability. Adidas says the new technology -- soccer's answer to the Olympics' controversial swimsuits -- can improve a player's power by 5.3%, his vertical leaping ability by 4%, and his sprinting speed by 1.1%.

"The bands minimize muscle vibration, which minimizes fatigue," Zea says, adding that the company also makes TECHFIT underwear.

Asked about the potential for controversy -- which broke out already when the French team wore TECHFIT gear when playing Ireland -- Zea is philosophical. "Our innovation gives players a slight edge, but still allows them to be part of the team."

Zea says replica balls and jerseys will be on sale at soccer specialty retailers. Adidas is also promising a huge digital push as part of the tournament, with lots of social networking to engage fans.

Below are the F50 adiZero cleats, which will be worn by World Player of the Year Lionel Messi (Argentina) and U.S. forward Jozy Altidore at the 2010 FIFA World Cup. "

Pictures taken from http://fastcompany.com

Picture 1:  Adidas' Jabulani ball

Picture 2:  Adidas' Argentina & France team jerseys

Picture 3:  Adidas' F50 adiZero cleats

Taken from the full story at http://fastcompany.com


Tuesday, May 18, 2010

"The Future Of TV Is: There Is NO TV (Just Different Screen Sizes) ~Avner Ronen, Boxee {Video} #PSFK #DigitalVideo #Boxee

From the PSFK Conference New York 2010 / psfk.com:

" Some highlights from his talk:

- “The internet rather than proprietary networks will be the backbone for video (tv is just one more connected screen).” Boxee is a great example of how this would be implemented. As a multi-media browser that combines the “worlds of The Internet and television,” Boxee allows users to stream entertainment across inputs, channels and platforms (e.g., LAN shares, DVD, and online services like BBC iPlayer, Last.fm, NPR, ABC, Blip.TV, CNN). Companies like Boxee that aggregate rather than segregate will be the entertainment operations of the future.

- “For the most successful shows, video will only be a piece of the offering (coming: gaming, social interactions, mobile.)” Ronen predicts the rise of transmedia and the fluid movement of entertainment across platforms and media (we’ve already started seeing this happen around cult shows like Lost, The Simpsons, and The Sopranos).

- “Discovery of entertainment will remain mostly passive.” Here, Ronen points out, is an opportunity for innovation. How we discover entertainment is still largely a top-down process – how can we create apps and technologies to change that dynamic, so that audiences discover the shows they like in an active, seamless way?

- “Audience fragmentation will grow (platforms will become audience aggregators).” With access to new and diverse content constantly growing, niche markets are becoming stronger, larger, and more numerous. Platforms are evolving into audience aggregators – taking over the role that networks once held.

- “The future of TV – there won’t be TV.” A reality that won’t come to fruition for at least several years, says Ronen, but a truth entertainment and media companies must come to terms with if they want to survive. "

Thursday, May 13, 2010

Trending Google Search Topic: " HOW DO I DELETE MY FACEBOOK ACCOUNT "

" ADOTAS – Google, you always know what I’m thinking! Or perhaps I let you do my thinking for me… Anyhow, ReadWriteWeb discovered that when a user types in the words “how do I” into the Google search box, the fifth entry on the drop down list of suggestions is “how do I delete my Facebook account” (right after “how do I love thee” and “how do I get a passport,” proving that Shakespeare and trying to get the hell out of your country are still more popular).

When I inserted “how do I” on my own Google toolbar, deleting a Facebook account actually came up beneath “how do I breathe lyrics”; as someone who listens to the radio about as often as I lactate, I discovered this was a song by R&B singer Mario, who begs the question “How do I breathe when you’re not by my side?” I suggest listening to the instructions at the beginning of Bush’s “Machinehead.” You know, Liz Phair also had a breathing problem a few years back — must be a lot of asthma in the music industry. "

More at....

http://www.adotas.com/2010/05/facebooks-brewing-privacy-nightmare/#
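
If you want to poke at these suggestions yourself, here's a minimal Python sketch that pulls them programmatically. A caveat: it hits the unofficial, undocumented suggest endpoint that browser search boxes have used (my assumption is that it still responds this way), and the google_suggestions helper below is just my own name for illustration — treat the whole thing as a toy, not a supported API.

import json
import urllib.parse
import urllib.request

# Unofficial Google Suggest endpoint (undocumented; may change or vanish).
# client=firefox asks for a plain JSON response instead of XML.
SUGGEST_URL = "https://suggestqueries.google.com/complete/search?client=firefox&q="

def google_suggestions(prefix):
    """Return Google's autocomplete suggestions for a search prefix."""
    url = SUGGEST_URL + urllib.parse.quote(prefix)
    with urllib.request.urlopen(url) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        data = json.loads(resp.read().decode(charset))
    # Response shape: [query, [suggestion, suggestion, ...]]
    return data[1]

if __name__ == "__main__":
    for rank, suggestion in enumerate(google_suggestions("how do i"), start=1):
        print(rank, suggestion)

Suggestions are personalized and shift over time, so don't expect to find the Facebook entry in the same slot — or at all.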

Monday, May 10, 2010