Just a quickie for fun… what is the future of Halloween?
This article from PRNewsNow points to the ongoing trend of Halloween being more of an adult celebration than a time of enjoyment for children. Reason: The “Baby Boomers,” who are the “never-grow-old” generation, have made it such, desiring to remain young and re-create the fun they had at Halloween as children. Of course, it doesn’t hurt that they are the demographic with the financial resources to generate a market for costumes, party items, decorations, food and Halloween treats, etc. As the article states:
“When it comes to retail spending and the holidays, Halloween ranks only second to Christmas. A $5 billion industry and growing, 60 percent of consumers reported taking part in some type of Halloween celebration in 2007 and spending an estimated $1.82 billion in costumes alone, according to the National Retail Federation.”
However, the present financial crisis makes it harder for “luxury items” and non-essentials to thrive; they’re always the first to go when people are hit hard in the wallet. If this economic downturn is long-term (as it appears to be), will Halloween – and possibly Christmas – return to being a family-oriented holiday that, along with other emerging social factors, works to change the fabric of society in favor of close-knit relationships? Our holidays and traditions are often an expression of the values we hold as a society, and can in turn reinforce those values. Will changing holiday traditions restore the concept of the “neighborhood” as a catalyst for social cohesion, trust, and a stability that could transform the quality of life within our cities? Without all the adult parties taking place, the inviting porch lights could be turned on again on All Hallows’ Eve, welcoming children and their families to make positive and relationship-building contact with one another. (This scenario has been brought to you by the unofficial organization for global recession “silver linings.”)
The changes we observe around us are accelerating, and in a positive feedback loop each successive cycle feeds on the effects of the previous one. The source of these changes is technology, the application of our ever-increasing knowledge of the world around us. As individuals and as societies, we have shown ourselves to be very capable of adapting to changes in our environment, but that capacity necessarily has limits.
To get an idea of what this might mean for us in the future, we’ve only got to look at the best example of UGC around today: YouTube.
Blogging was great, but there appears to be far more power in a video than in a long-winded piece of text. Homemade internet radio is fairly popular, but sadly not to the extent it could be; for this I blame the lack of microphones as standard equipment on modern PCs. YouTube has allowed people to present themselves and their opinions more effectively than ever before.
Who knows how this could evolve? Anyone can achieve relatively high production values given the right software. As it becomes easier to edit, present, manipulate, and even research content, more and more possibilities open up to amateur creators. Professionally created material that amateurs could use in their own work, such as blue-screen backgrounds, soundtracks, or special effects, could become a respectable market within a few years.
Perhaps user-created interactive experiences could have even more impact. Tools could be written allowing radical yet user-friendly customisation of game engines. Spore has already started down this fascinating path.
The increasing richness of memorial media is a powerful by-product of accelerating change in technology, information and communication. In five years’ time, both broad public-facing and private 3D memorial media have a good chance of taking off, gradually catalyzing a shift in the way we interact with history and our dearly departed.
How do we properly remember and honor the dead? Our cultural answer to this question has changed over the millennia alongside the invention of memory-enhancing technologies such as symbols, spoken language, writing, photography, video, digital information and the web.
Now the trend continues as powerful new disruptors such as social media, semantic search, virtual worlds and mirror worlds allow us to assemble, aggregate and interact with information about the dearly departed in surprising new ways.
On the most basic level, crowd-edited text-based structures like Wikipedia have already catalyzed an explosion of biographical data capture and made possible a growing niche of specialized human memorial websites.
Similarly, account-driven portals like Genealogy.com’s Virtual Cemetery Project, MyCemetery, and World Gardens have been growing in popularity, and each lays claim to being “The World’s First Online Memorial and Virtual Cemetery” or the like.
In the physical world, progressive cemetery Hollywood Forever, which boasts the densest concentration of celebrity gravesites, has sparked a media-memorial trend by displaying actors’ highlight reels beside their tombs. (Yes, for a pretty steep price you too can purchase your very own Lifestories Kiosk.)
TV appears to be the first casualty of this change in attitude. The mere advent of more channels began diluting TV audiences in the late ’90s. The sheer volume of choice has made people realize that they no longer need to watch what they’re told to watch, and that TV doesn’t hold the same power over their lives anymore. Even quality programming hasn’t saved it, as the “On Demand” paradigm has put timing in the hands of consumers.
As well as choice and control over TV content, the rise of DVD, the internet, and video games has further dispersed consumer attention. The new generations are growing up with an abundance of choice over their entertainment. This alone is causing them to demand choice and control in everything they do, something that is soon to have a major impact on our lives.
The office. It’s a dreaded workspace for many, for others it’s a grand tradition (and, for a few, it’s just a funny TV show). However you see it, the office as it exists now is evolving. Have a look at yours. Does it resemble the standard Dilbert-esque vision, rife with miles and miles of identical cubicles and sticky notes, studded with those ever-flattering fluorescent tubes? Or is it a simpler setup: a laptop on your lap?
These days, companies are rethinking the way we work. The new workspace, called a non-territorial or non-assigned workspace, resembles a modern version of musical chairs: employees come to work and find their spot. This model works for Cisco Systems. At other companies, such as Bank of America, employees can reserve spaces or meeting rooms. Others (think IBM) don’t even have offices.
Mind you, the concept of the paperless office isn’t new. It’s been floating around since the 1940s. The Atlantic featured a piece on the Memex in 1945, a theoretical proto-hypertext machine that was to function as a self-contained research library. Life Magazine soon followed with illustrations. And, of course, we can’t forget gems like The Jetsons, or Brazil, or even Spielberg’s Minority Report.
Although we’re not quite hovercraft-bound, the future of the office is increasingly flexible and mobile. Employees will no longer be confined to the cubicle. The advent of wireless technologies, smartphones, teleconferencing, and the Web 2.0 cloud has made the office as we know it a thing of the past. Today, virtual is the way to go.
Nothing gets humans up in arms like a new technology. Will it cure our ills and save us from destruction? Or end the world in one cataclysmic Earth-shattering moment? Clearly, no invention has accomplished either, but try telling that to the fanatical, hysterical or just plain irrational among us. Now, with technology advancing at an ever quickening pace, rational thinking is in short supply. Here then, to prove this point, are eight of the biggest freak-out moments in technology history:
Writing Will Make us Forget – Socrates
The written word and the ability to understand it is considered one of the most important developments ever achieved by mankind and a defining step for any civilization. But not everyone was a fan. Even that hero of western philosophy, Socrates, once argued that writing would make people lazy and forgetful!
“The fact is that this invention will produce forgetfulness in the souls of those who have learned it,” said Socrates. “They will not need to exercise their memories, being able to rely on what is written, calling things to mind no longer from within themselves by their own unaided powers, but under the stimulus of external marks that are alien to themselves. So it’s not a recipe for memory, but for reminding, that you have discovered.”
Sound familiar? It is the same argument that some people nowadays are directing at both Google and the World Wide Web.
Given that pretty much every major advancement since the birth of writing is built on writing itself (collectively we have advanced much faster through its use), it certainly did anything but make people lazy. Forgetful? Perhaps, on an individual level. But I sure am glad Plato broke out his quill to write down Socrates’ teachings; otherwise I couldn’t “remember” to complain about him now.
Get Out of the Way, Here Comes the Train!
Reportedly, when the Lumiere Brothers showed their films for the first time at the Grand Cafe in Paris in 1895, audience members ran out of the room in a panic. Why? To avoid being hit by the image of a train pulling into a station!
Authority figures sure have gotten a lot smarter about dealing with public protests. In the ’60s and ’70s, public protests were met with violent crackdowns from police and National Guard alike. With television and cameras able to record these confrontations, the resulting images became icons of the movements they depicted.
There were the Kent State shootings, immortalized by the photograph of Mary Ann Vecchio screaming over the body of slain protester Jeffrey Miller. Or the famous footage from Birmingham, Alabama in 1963 of police blasting protesters with fire hoses and siccing their dogs on high school students.
It was due to these images that the traditional way of dealing with protesters had to change radically. In his paper titled “From Escalated Force to Disruption Control: The Evolution of Protest Policing,” Alex Vitale, a former consultant to the ACLU and Assistant Professor at Brooklyn College, states the following:
“Prior to the 1970’s police relied on a doctrine of ‘Escalated Force’ in responding to demonstrations. Following numerous reports, civil lawsuits, and media coverage criticizing the violence that often resulted from this approach, many departments developed a doctrine of ‘Negotiated Management,’ which attempted to minimize violence through improved communication with demonstrators and greater tolerance of disruptive activity.”
Tactics had to change; police could no longer use any force necessary to quell a public protest. This is especially true in this day and age, when even videos of earthquakes are posted on the internet within minutes of their occurrence.
A great report by the ACLU (co-authored by Alex Vitale) on the protests during the 2004 Republican National Convention details how police used mass arrests, detentions, cheap zip-ties, horse charges, intense surveillance and limited access to combat the possible threat from protesters (no one wanted a repeat of the infamous Battle of Seattle in 1999). Tactics have changed, and as a result the voice of the protester is growing fainter and fainter.
If there’s one thing NASCAR has shown the world, it’s that people will watch even the most boring “sport” on the planet in the hopes they’ll see a little blood.
The fact is, people like to see destruction. No, I’m not saying they like to watch death or serious injury, but they do enjoy dramatic destruction. Like it or not, seeing cars smash into each other at high speed is exciting. Even footage of crashing airplanes gets a good deal of attention on YouTube.
A quick glance at human history reveals that people have always had a taste for blood, from the Greeks with their Olympic Games to the Romans and their arena gladiators.
Think about it. There’s a reason traffic slows down near an accident even after the wreck has been cleared to the side of the road, a reason people crowd around a burning building, a reason The Dark Knight was so popular (want to watch me make a pencil disappear?), and a reason torture-porn movies like Saw and Hostel have raked in so much cash.
So what about our future sports?
We may begin to see more sports straight out of post-apocalyptic movies. With nanobots able to repair injuries within minutes and safety technologies advancing day by day, shouldn’t we expect sports to continue pushing the envelope?
Cities around the country could set up their own arenas, much like the Romans built coliseums around their empire. The Thunderdome from Mad Max could soon become a contemporary institution (in fact, real-life Thunderdomes already occur today, but are notably less deadly than the fictional kind). With such new sporting events, sports relying on violence for viewers, like the UFC, which displaced boxing, might find themselves outdated.
An interesting piece of news floating around the Internet these days is the creation of the seemingly unbelievable Amethyst Initiative. The group, endorsed by college and university presidents across the nation, aims to bring the drinking age back into the national discourse. “Amethyst Initiative presidents and chancellors call upon elected officials to weigh all the consequences of current alcohol policies and to invite new ideas on how best to prepare young adults to make responsible decisions about alcohol use.” They don’t necessarily call for the outright lowering of the drinking age to 18, but they do say that our current drinking laws just aren’t working.
While the problems attributed to the current drinking age by the Amethyst Initiative are numerous, what would lowering the drinking age do for our culture of binge-drinking?
For one thing, introducing people to drinking at a younger age would hopefully take away the novelty of getting drunk. So many college freshmen, amazed that they can go from the restrictive drinking atmosphere of home to unlimited drinking, spend their time treating alcohol like a new amusement park ride: going to keggers where Natural Light or Pabst Blue Ribbon flows like water (it is water), getting upperclassmen to buy half-gallons of plastic-bottled vodka for mixing with god knows what, and of course getting so drunk that people have to carry you home or even to the hospital.
An honest assessment of my exposure to the extreme life-extension meme.
Since being exposed to the idea of extreme life extension, which admittedly was only several months ago, I’ve found myself reacting in a more skeptical and reactionary manner than I often do when confronted with other radical new futuristic ideas and technologies. When I read about possibilities of faster-than-light travel, I get excited. Predictions of nano-assemblers make me hopeful. I find designs for colonies on the Moon and Mars fascinating. But when I read about trends in regenerative medicine and nanotechnology that some experts believe will conquer death, I am not enthusiastic. Instead I become very skeptical, nervous and even angry. On one level, I am surprised that I could be anything other than overjoyed that ending death could be a possibility; I very much enjoy life and, as a living organism, I have a strong instinct to stay alive. Yet I find it extremely difficult to wrap my head around the idea of life without death.
So why does extreme life extension make me uncomfortable? I’m not, nor have I ever been, a religious person, though I have respect for those who are. I was raised by two atheists with PhDs in science and I haven’t ever held out hope for an afterlife. It’s not that I don’t value human life – I value it very much. As a humanist, I believe very strongly that each human life is sacred and unique, and that it is within our power, and is indeed our responsibility, to work towards giving every person as good a life as possible. Nor do I believe I am a Luddite. I am increasingly excited about technology in general: I love my cellphone and the snazzier one I will someday get. I love my computer and the wonders of the Internet. I’m fascinated by the promise of the Semantic Web. I also embrace any technology that could cure diseases or repair injuries. But when it comes to anything that may fundamentally change the way I am, or the way people are in general, I am very hesitant.
I thought it would be interesting to explore some of the reactions, thoughts and feelings I have when pondering extreme life extension, as I think they probably overlap with those of the people who have been or will be exposed to these ideas.
The logic problem: Defying death seems to break down logic
When I think about the end of death, I find it hard to express myself in logical, objective terms. I am tempted to call my reaction against extreme life extension a “bias,” because there is undoubtedly an emotional aspect and I do have a predisposition against the idea. But “bias” implies an illogical perspective; can considering death a certainty really be regarded as illogical? I begin to think, “Hasn’t everything that has ever lived also died?” Well, yes, except of course for the trillions of life forms that are alive right now. So the answer becomes not “Everything that has ever lived has died” but “Everything that has ever died has died.” That answer is so circular that it isn’t even useful.
The Future Centers of Europe are open, comfortable and collaborative hubs established to help groups of people problem-solve, brainstorm and generally think creatively about the future of their companies or organizations. Are they an indicator of changing work attitudes and styles? See for yourself:
It is tempting, at first glance, to think of Future Centers as conference facilities or even classrooms, and there is some similarity. However, Future Centers are designed not for people to merely absorb information, but rather to exchange it. They are, as the video above says, “mind-friendly” spaces for our new knowledge economy. The philosophy behind Future Centers is that how people think about problems, and how they exchange information, is essential to innovation. Future Centers seek to break down barriers of hierarchy and formality to encourage connections and the free exchange of ideas. Sound familiar? It’s the same basic philosophy inherent in the World Wide Web.