Quiet, Please!

WHAT THE WEB CAN LEARN FROM THE SOCIAL BRAIN

Remember the early days of social networks and all the hype around Web 2.0, “the social web”? What started out as one form of online activity has taken the digital world by storm and is by now an inherent part of it. With Facebook’s well over a billion users and smartphones in everyone’s pockets, people around the world are playing more social games, uploading more content and interacting with each other and with products on the web at an ever-increasing rate. With so much digital noise around us, it is getting harder and harder to follow, let alone find a haven of peace and quiet.


So what has the brain got to do with it?

With current brain imaging techniques, researchers can track which parts of the brain are active while we do anything from watching a movie to being voted the weakest link. In recent years, brain studies have surfaced an unexpected finding that has been rattling the worlds of neuroscience and psychology and has led to the emergence of the new field of Social Neuroscience. Basically, what brain scientists across different disciplines have discovered is how central social thought is to the human mind. When you think about it, it makes perfect sense. The fact that humans can collaborate with each other is what has enabled us to build (and sadly also to break) great things throughout the history of humankind.

Without knowing what’s going on under the brain’s hood, you’d think the social nature of people is made possible simply by people talking to each other. But the social brain goes far beyond language. Language may be the direct visible channel between people for sharing ideas, emotions and instructions, but by no means is it the only, or even primary, one. Truth is, language accounts for only a small fraction of human communication. The vast majority consists of other forms of non-verbal communication that require special mechanisms in our brains. Those mechanisms are dedicated to deciphering what’s going on in the other person’s brain. We all have to be mind readers, if you will. The “social brain” is a set of specialized brain regions doing exactly that – decoding the mental states in the brains of people around us.

Talking brains

The social brain has three unique characteristics that make it so intriguing. First, it is “loud” by default. It constantly chatters in its own language – neural activity (neurons “firing” electric signals). The social brain regions are the only ones that don’t reduce their activity at rest. When a person simply lies inside a brain imaging monitor (an fMRI machine) without doing anything, the entire brain becomes less active. The entire brain, that is, apart from the social regions. In other words, we have an innate tendency to view the stuff around us as social organisms. Think of how we personify everything – from gadgets to buildings, we tend to view the world as having a soul: feelings and wants and needs. That’s because the social brain doesn’t normally shut down.

And what about the social web? So much has been written about the noise it’s causing and the ADD generation we’re raising that I won’t even get into it. From facebook to twitter, commenting to liking – the social web is everywhere, constantly loud, constantly affecting us. But it’s not all bad. If you look at neural networks, noise is constantly there, and it can drive progress and help overcome problems inherent to the system. Projecting that back onto the online world, I’d argue that this noise in our collective social brain leads to bursts of innovation and creativity.

Too much noise
(image: Smeltley by Hector Ilanquin)

Another unique characteristic of the social brain is the existence of mirror neurons. These really are one of the most intriguing phenomena in the brain (and certainly one of the most controversial). Mirror neurons are, as their name suggests, neurons that mirror other neurons’ activity. Which neurons? Neurons in the brain of the person you’re interacting with. When someone you’re talking to is extremely happy, not only can you hear it in the ecstatic tone of their voice or see it in their smile – your very neurons “feel” it and mirror the “happy neurons” in the other person’s frontal lobe. In other words, social brain activity is infectious.

And where are mirror neurons in the web? For that we’ll have to define the equivalent of a neuron, a task I’m not quite ready to take on (help, anyone?). However, the results of the mirror neural activity are quite apparent to me. What are memes if not infectious social behavior? What is virality, one of the most distinctive characteristics of the web, if not people echoing others’ behavior, thoughts, emotions? As memes spread through the social web, social brain activity can infect others and spread through the collective social brain.

(image: kitten and partial reflection in mirror)

In order to compensate for being constantly loud, the social brain has another unique characteristic without which we could hardly survive. The social brain turns itself down when something else is being computed. You’d hardly want to start personifying functions while trying to solve math equations now, would you? So when you’re trying to calculate something or focus on a hard problem, the social brain realizes it would only be interfering and therefore shuts down.

But what turns off the “social web”? Fact is, we stopped calling it the “social” web because the social layer has become as invisible as it is omnipresent. The noise is now everywhere. So how should the web turn down the volume when we’re doing other stuff? Our biggest problem is that there is no specialized mechanism built into the system, as there is in the social brain. Some people believe the problem isn’t technology, it’s us. But expecting people to cut themselves off is as good as expecting a neuron to stop firing. We are inherently designed to be drawn, to react.


Until a time where the shutting of the noise is built into the system, it really is up to you to turn it down. So turn off your phone when you’re having dinner, go outside more. The sun is shining, birds are chirping. Technology has still not taken complete control of our lives.


Entire History of Me

We tend to think our memory is much like a camera – that we can replay events from memory in detail, exactly as they happened. But the truth is far from it. In fact, our memory is quite miserable. Studies repeatedly show to what extent. You must have noticed it too – whenever you argue with your partner about a past event and she claims things went down differently than you remember, it’s not that she’s lying, nor is she crazy! She truly recalls a different version of the events than yours. And neither version is probably precisely what happened. Without getting too philosophical about it (what is truth and does it exist outside our minds?), I’ll stick to what’s known: our memory is wired so that it changes with time. Every single time we use a memory, we store it back slightly altered. You can see how easily, over time and many uses of the same memory, any relationship between it and the original event becomes loose at best.

As an app geek and life-logging addict, I’ve been using many services over the past few years to help me remember my life. I’ve been logging where I was, what I did, what I saw, how I felt – basically everything I could find a service for. The brilliant “The Entire History of You” takes that approach a step further. It depicts a futuristic world where our whole lives are recorded onto a chip implanted behind the ear, replayable for anyone to see. Sounds freaky? It is.


With all these life-logging services, it seems we’re on the right path to this “brain TiVo” future. Still, there’s a long way to go until our entire history is logged and recorded like that.

Ta-da! Enter timehop.

 

Timehop

A year ago I started using this incredible app, and I have completely fallen in love with it since. Timehop is a super simple service that reminds you every day exactly what you were doing a year ago today. It started out as a daily email (it’s now also available on iOS) that shows you every morning your posts and pictures from twitter, facebook, instagram and foursquare from a year ago today. As funny as it may sound to an outsider, slowly but surely it has become my favorite app of the day. It has all the ingredients of any great technology: it makes me look forward to it (every single morning!), it makes me smile, and every time I use it, it feels like magic. And like every good technology, it rewires my brain with every single use.
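The mechanic behind it is simple enough to sketch: given a feed of timestamped posts, keep only the ones from exactly one year ago today. Here’s a minimal version in Python – the post structure and field names are my own assumptions, not Timehop’s actual code:

```python
from datetime import date

def one_year_ago_today(posts, today=None):
    """Return the posts dated exactly one year before `today`.

    `posts` is a list of dicts with a `date` field -- a stand-in for
    items pulled from the twitter/facebook/instagram/foursquare APIs.
    Note: `date.replace` raises ValueError for Feb 29 in a non-leap
    year; that edge case is ignored in this sketch.
    """
    today = today or date.today()
    target = today.replace(year=today.year - 1)
    return [p for p in posts if p["date"] == target]

posts = [
    {"date": date(2011, 10, 25), "text": "checked in at a cafe"},
    {"date": date(2011, 10, 26), "text": "uploaded a photo"},
]
print(one_year_ago_today(posts, today=date(2012, 10, 25)))
```

The “less is more” trick is all in that single equality check: everything not from exactly one year ago is simply dropped.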
Wait… WHAT?!?

How the brain stores memories is one of the toughest questions regarding the human mind. What we undoubtedly know, however, is how fallible it is. To top it off, by now we know that technology – and specifically Google – rewires the memory process (I’ve even written my take on that). Not only do memories deteriorate over time, they are also frequently recalled erroneously. Now imagine: once a year, you give your memory a boost with some real past events. A reality check, if you will. Such a simple and elegant way to improve your memory using technology. I wonder if the timehop guys knew what they were onto 🙂

 

Timehop

With timehop I discovered that “less is more”. By stripping remembering down to a push reminder about a single day, 365 days ago, the timehop team has succeeded in distilling that enormous, ever-elusive digital memory of our lives into an easily digestible, fun and current manifestation of it. The app doesn’t require too much of your attention, and the interaction with it is very limited. When you open it, you know it won’t take you longer than a minute or two before you’re on to the next thing. It’s perfectly suited to our information-overloaded ADD lives. And that’s the thing about technology: making it fun and easy is no less important than making it work and do amazing things that are “indistinguishable from magic”.

So before we have that all-encompassing, omniscient machine, a stripped-down version is a great substitute. And perhaps even better. Who can tell what effect this total recollection we see coming upon us will have on our lives? As Borges notes in Funes the Memorious, “to think is to forget”. It is forgetting, not remembering, that makes us human. What narratives will we build in our lives if, like Funes, all we have are details? An entire history might simply be too much. So kick back and watch your history on timehop – trust me – only a few pictures and words are needed. Nothing more.

What’s to Know?

How knowledge is changing, yet again

I always had this romantic notion that my library gives the walls of my living room a dusty, intelligent look. That huge pile of covered paper used to be my pride and joy. I meticulously collected fiction, science, philosophy, poetry and art books and thought of the collection as a window not only into my education but into my soul. Today, I shamefully admit I read much less than I used to. I focus on non-fiction works, and usually read about seven of them in parallel, most of which I never finish. Today, my library is a graveyard of the old world I come from.
But this post is not about books. It’s about what those books represent. I read less mainly because the source of my knowledge has shifted – from books to anything online. I consume heaps of written content, but instead of flipping through paper, I now simply scroll down. And while our generation is in an in-between phase, getting to watch the paper-to-digital revolution take place, I’m sure (or at least, hope) my kids will live in a paperless world.

 

Bookshelf

(image by trawin)

Of course, the world didn’t always rely on paper to spread knowledge. Before Gutenberg invented the printing press, knowledge was spread in classes, in person. That’s why the invention of print was such a huge deal: it revolutionized the distribution of knowledge and created new forms of media which, in turn, created new industries and business models. Those models are breaking down today as we shift to a digital world.
And once upon a time, long long ago, before the invention of writing, knowledge could only be documented, transferred and acquired orally. Just imagine: to learn something, people had to be in the same place at the same time. Once administrations became more complex and financial accounts and historic records outgrew human memory, writing was invented, and it changed humanity forever.
Fast forward to the 21st century, and we’re experiencing another revolution. As with the former ones, it is not only the media and the quantities that are changing, and not even only the businesses and industries around those media; it’s the very core of human knowledge that is changing. What it is to know something, or to remember something. I’ve long claimed that our memory is changing as we come to treat the web as an extension of our own memory. Recently, researchers at Columbia University have shown exactly that: our memory works differently in the age of Google.

 

And it doesn’t only apply to knowing (or remembering) facts or information. With facebook, twitter and the social web, how we know people is changing too. You can know someone virtually better than you know people you meet IRL (in real life). As Fred Wilson so eloquently put it: “in real life, as if there were anything other than real life”. That’s my point exactly. People keep making the distinction between real and virtual, when in fact they are one and the same. As we go through the information-age revolution, our very minds and cognition are changing, augmented by technology.

And what we’re seeing now is just the beginning. I, for one, can’t wait for the day I’ll finally be able to search Google straight from my mind. Just imagine: information streams coming directly to your brain at will, and you sift through the results with your own neural activity. Doesn’t that make more sense than reaching for your iPhone every time?

Let’s talk Facebook notifications

Facebook notifications are broken.
And I mean broken to the point I’m ignoring them.
Now let me step back and explain.

First, just to clarify: by notifications I’m referring to those little notes you get inside facebook when you click on the globe icon at the top left. As you all know, when you have new ones, their number appears in a round red badge.

So why are they broken?

Take this morning, for example. I’ve only been up for a couple of hours and have already received 11 of them! Eleven! Which got me to look back at my notifications data. Facebook only keeps a week’s worth of those, but from what I can tell this week seems pretty representative of what’s going on. Over the past week I’ve received a total of 143 notifications – an average of about 20 notifications a day!

The problem isn’t only the quantity, though; it’s the quality that counts.

I took a closer look at my notifications and realized about half of them don’t really interest me. In fact, only 78 notifications across the week were of real interest – about 11 a day. Now that’s more like it.

In general, there were two types of notifications that didn’t interest me:

  1. Event notifications
  2. Groups I’m subscribed to but don’t take an active part in (these can change depending on when / where I am – e.g. right now I’m in NY, so all the groups about stuff going on back in my hometown of Tel Aviv are a waste of time).

Getting so many notifications that are irrelevant to me – at least not in an immediate way that requires real-time notification – is a real problem. It requires a whole lot of time and attention to be dedicated to sorting through which ones are actually important. And more often than not, I miss the important ones. It has actually gotten to the point where I simply pay less attention to notifications as I get them. When I’m busy, I ignore them completely. And that’s sad, since some of them really are important to me. When someone comments on my post or replies to my comment, I do want to know.

Facebook could easily restore my belief in notifications by adding settings to choose which ones I want to get. They already have that for email notifications, but why stop there?

And, taking it a step further, they could algorithmically figure out which notifications I’m interested in by learning which ones I actually click on. A rule of thumb that’s pretty apparent: activity by my friends, on my posts, or that directly follows my posts generally interests me more than activity by strangers. As a start, though, I’d be happy to simply get the option to choose myself.
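Even before any learning, that rule of thumb can be written down as a plain scoring heuristic. A hypothetical sketch – the notification fields, weights and threshold are all my own assumptions, not anything facebook actually exposes:

```python
def notification_score(n, my_friends, my_post_ids):
    """Heuristic relevance score for one notification dict.

    The fields (`actor`, `post_id`, `kind`) are hypothetical
    stand-ins, not a real Facebook API shape.
    """
    score = 0
    if n["post_id"] in my_post_ids:      # activity on my own posts
        score += 2
    if n["actor"] in my_friends:         # friends beat strangers
        score += 1
    if n["kind"] in {"event", "group"}:  # the two noisy types above
        score -= 2
    return score

def worth_showing(n, my_friends, my_post_ids, threshold=1):
    return notification_score(n, my_friends, my_post_ids) >= threshold

friends = {"alice"}
my_posts = {42}
n1 = {"actor": "alice", "post_id": 42, "kind": "comment"}
n2 = {"actor": "stranger", "post_id": 7, "kind": "group"}
print(worth_showing(n1, friends, my_posts), worth_showing(n2, friends, my_posts))
```

A real version would learn the weights from clicks instead of hard-coding them, but even this crude filter would have cut my 20-a-day stream roughly in half.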

In the social media era we live in, we’re swamped with enough real-time distractions. I expect nothing less from facebook than to help us deal with it by sorting through the clutter.

Next time you get a notification from your first-grade classmate inviting you to his garage sale in Alaska, you’ll feel it too.

How a simple click changed my surfing behavior

Being able to stay focused when we’re faced with such a plethora of online content and engagement options is virtually impossible. You all know what I’m talking about. The amount of stuff we have open on the desktop at any given point in time, all begging for and sucking up our attention, is crazy (browser, IM services, documents, presentations, command line, itunes, …). And don’t even get me started on tabitis.


And that’s only your laptop I’m talking about. Your smartphone is lying “harmlessly” right next to it, constantly pushing new info at you about where your friends are, who just emailed you, who replied to a message or commented on something you posted. Crazy. I keep saying distractions make me concentrate and that it’s the only way I can work. Yeah, right… Who am I kidding? No one, including myself.

Enter Chrome for a Cause.


Like every good story, this one starts with a Chrome extension 🙂. Chrome for a Cause is an extension Google released for the holiday season that lets users donate their tabs to charity. For each tab you opened, Google contributed money to a charitable cause of your choice. The Chrome for a Cause campaign ran for a mere 5 days, but it took far less than that for it to seriously affect my surfing behavior. Soon after installing the extension, I noticed I was becoming so obsessed with my tab count going up that I actually started closing each tab I finished reading in order to open more tabs and donate more money. Until then, what I used to do was simply google the next thing I was looking for from the address bar of the tab I was already on. Now I found myself closing old tabs and opening new ones for my next endeavour.

It soon dawned on me that it had quickly become not about the causes, but about ME – I wanted to reach the 250-tabs-a-day limit. I wanted to prove to myself I was a heavy user. To myself! Not even to others (though I was so infatuated with the extension that I helped promote it 🙂).
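For what it’s worth, the arithmetic behind that limit is just a daily clamp. A toy sketch – the 250-tab cap is the one from the campaign, the daily numbers are made up:

```python
DAILY_CAP = 250  # the per-day tab limit mentioned above

def counted_tabs(tabs_opened_per_day):
    """Clamp each day's tab count to the donation cap."""
    return [min(tabs, DAILY_CAP) for tabs in tabs_opened_per_day]

week = [120, 300, 250, 80, 500]   # hypothetical daily tab counts
print(sum(counted_tabs(week)))    # 950: days over 250 are clamped
```

The cap is exactly what made it a game: a visible daily target to chase, rather than an open-ended counter.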


And while deep inside I knew it was more about me than the actual causes, knowing that something good came out of this crazed state I drove myself into was satisfying.

The tab-closing behavior became so ingrained in my brain that I keep doing it today, a few weeks later. Yes, it’s that easy to change behavior. I mean, seriously! A couple of days of tab counting and I was hooked (or maybe it’s just the brilliant product designers on Chrome who made my tab closing so damn sticky).

Being the rational person I am (even though the behavior described above can hardly attest to that…), when the campaign was over and I noticed my tab-closing habit had stuck around, I started mulling it over. There are a few advantages to closing old tabs – especially memory-related ones (I love Chrome, but it can easily reach 2 GB of memory with the number of tabs I keep open). I can think of a few other excuses to close tabs, but that’s all they are – excuses. Being well acquainted with some other unhelpful behaviors promoted by gamification, I realized I was only trying to rationalize something that’s inherently irrational. Much like people playing Farmville: once you start spending time, let alone money, on virtual sheep, you have no choice but to tell yourself it’s rewarding, or the money you had already spent was simply wasted. So you go on and spend even more money! Ridiculous? The hundreds of millions of dollars Zynga is making suggest otherwise…

So instead of trying to feel good about the crazy competitive gamer within me, I decided to focus on what good can come out of it. If it’s so easy to change surfing behavior, why not use the same mechanisms to drive a positive change?


And one of the most burning problems to tackle would be online attention. Instead of rewarding users for opening tabs, let’s reward them for keeping their focus. For not checking their facebook / twitter accounts every half hour. For not keeping a million tabs open and never going back to them. And if we integrate that into a tasks app, we can even track people’s progress and reward them for that too.

And online focus is a small problem. What about bigger ones? I’ve recently been reading Clay Shirky’s Cognitive Surplus, which opened my mind to the incredible possibilities humankind has in the connected age we live in. If we correctly use gaming mechanics to seamlessly integrate value production into everyday surfing in a fun, simple way, we could well be building the next Wikipedia with a simple click.

Game Your To-Do List

Using gaming mechanics to manage your time

As any freelancer, self-employed person, or anyone else working from home knows, managing your time and getting things done in general doesn’t always come easily. Especially not to people like me, who always do everything at the last minute. Mind you, I usually spend 20% of the time others do and get the job done at least as well as they do. Working under pressure gets the best out of me. The clearest evidence for this must have been my master’s thesis. I was writing it in August 2006, in Edinburgh. And Edinburgh in August is festival time.

Edinburgh_festival

The city is filled with artists, actors, dancers, street performers, parties, and the biggest festival crowd in Europe. It’s total mayhem, but in a good way. August 2006 was the last month before my thesis submission. My classmates were all a couple of months into hard work, whereas I was way behind in my writing. In fact, I believe at that point I hadn’t even started the actual writing. BUT it was festival time. I couldn’t sit and work all day in the labs when I knew what was going on outside. So I decided that each day I was going to do something festive. Thanks to a few friends I got a bunch of free tickets and party invites (student budget, after all). And every single day I would go out to see a play, dance, walk around, see street performers and meet people from around the world. Live and breathe the festival. My friends thought I was crazy. How can you hang out when the dissertation is due in a few weeks?! Well, I couldn’t not. I soon realised these breaks were the fuel that drove my work during the day. Sure, I could bum around the labs all day, OR I could work a few intensive hours knowing that I’d get to go out for a couple of hours and enjoy myself. I don’t think I was faster than my classmates; I believe I was making better use of my time, working harder while I was working, and partying as a reward.

A while ago I got to see an amazing talk by Amy Jo Kim called Putting the Fun in Functional. Kim discusses how using gaming mechanics can help build better software. It’s quite an ingenious concept. Gaming mechanics are the rules behind the design of a game that allow the players to have a fun and engaging experience. Though this was the first time I had ever heard of gaming mechanics, I had actually been unknowingly obsessed with the field for the past year or so.

As an avid user of the location-based app foursquare (and one of its first Israeli users), I was amazed at how seamlessly they use simple (or, as I called them, “begrush” = cheap) psychological elements to make their game a hooking one. Using points, badges, titles and more in a resourceful way, they’ve made their application entirely addictive (at least for a while, but that’s another story). So when I bumped into Kim’s talk, I knew I had to dig deeper and learn more. So I did. I read about Zynga’s methods and legendary playbook, SCVNGR’s gaming mechanics playdeck and more. And then it hit me – I use this stuff daily.

I did it back when I was a student in Edinburgh, and I’m doing it today. As an entrepreneur working garage-style, mainly from home, I’ve crafted a few methods of my own, but they’re all rooted in the basics of gaming dynamics. So essentially I’m gaming myself into checking stuff off my task list. And if I can do it, so can you! So here are my 2 cents on how to use some simple gaming rules to work more efficiently and have fun while you’re at it:

Quests – these are the building blocks many games use to give players specific tasks to go after once they enter the game. It’s a “goal interface” design, if you will. In real life, it’s simply your to-do list. Each bullet is a well-defined quest to complete a task.


 

Countdown dynamic – players are only given a certain amount of time to complete a task. This dynamic is for people (yours truly included) who work best with a deadline. The problem is that, as opposed to games with timers, in real life promising yourself you’ll finish something by a certain hour doesn’t work as well. What I’ve discovered works best for me is linking the deadline to an external factor. So I’ll tell my partner, my designer, or even a potential investor I’ll have something ready for them by tomorrow. And it works. This may sound a bit crazy – promises to others drive me more than promises to myself. But the truth is, that’s one of the biggest concepts in gaming dynamics: the power of the community. People are driven to work harder when they have competition, when they can show off what they’ve done, when they get feedback. So why not embrace it and let it help me in real life?

Achievement dynamic – get a reward for achieving something. In our case, give yourself a reward for completing the quests you set for yourself. The crucial thing here, in my opinion, is to set the reward in advance. If I tell myself I’ll get to see a movie when I finish writing a document, I’ll finish writing that document in time to catch the movie. Of course you can’t go to a movie every time you finish something; simply taking a break also works. Try saving the big rewards for bigger quests, or for a few consecutive ones (this is actually a separate dynamic, “chain schedules”). Also, try to set the rewards not only according to the quests, but also according to your own mental state. I, for one, know that I hate working in the afternoons. Mornings and evenings are my time. So I might schedule a reward in the form of afternoon coffee with a friend to bypass that.

Progression dynamic – this dynamic measures the player’s success and displays it granularly. Make a schedule and divide it into tasks. Ticking off tasks is seeing progress. And if you visualize it as a progress bar, that works even better (just think of the linkedin progress bar).


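To make the progression dynamic concrete: rendering a to-do list as a linkedin-style progress bar takes only a few lines. A purely illustrative sketch:

```python
def progress_bar(done, total, width=20):
    """Render task completion as a text progress bar."""
    filled = int(width * done / total)
    percent = 100 * done / total
    return "[{}{}] {:.0f}%".format("#" * filled, "-" * (width - filled), percent)

# A tiny to-do list: (task, is_done)
tasks = [("write draft", True), ("edit images", True), ("publish post", False)]
done = sum(1 for _, is_done in tasks if is_done)
print(progress_bar(done, len(tasks)))  # [#############-------] 67%
```

The point isn’t the code, it’s the feedback loop: seeing the bar fill is what makes ticking off the next quest feel rewarding.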

There are quite a few more, but the main thing here is to acknowledge that we all need some help. And as Woody (or Larry) says – Whatever Works.

As an end note, examining the difference between gaming and real life, I notice how many of these dynamics involve a community. From leaderboards to feedback, the community is often what gets us going in a game. Just imagine where we’d get if we could build a game-like community around our work tasks, or our lives in general. Gamify life!

 

Memory – meet Google

OR: The Emergence of Referential Memory in the Google Age

“Oh yeah, that reminds me of something I read. It was, what’s his name, during the 80’s… no, the 90’s. Hmpf. Wait, I’ll google it.” How many times have you heard that? Felt that? Once a day? Once an hour? People are saying Google is ruining human knowledge. There’s constant talk about how the information age is changing, even harming, our brains. I believe by now it’s safe to say that the age of information is not only light-years away from the age of knowledge, but inherently moves us even farther from it. As access to information becomes effortless, we no longer need to know things, only how to find them. So what’s happening to our brains in the process?

What is Memory?

Traditionally, psychologists divide human memory into two main types: implicit and explicit. Implicit memory refers to stuff we know without quite being able to point to how we know it or what it is to know it. The myth of the Coca-Cola ads in the 60’s is a great example. Coca-Cola allegedly used what psychologists call priming to raise awareness of its product: embedding split-second displays of a product into movies makes people think of it, even though they couldn’t explain why, as they wouldn’t remember seeing the images. But there’s no need to go to the extent of such subliminal messages. We are constantly swamped with massive amounts of information we’re unaware of, yet it changes the way we perceive things. Whenever we read a newspaper, watch a movie or even talk to a friend, we’re presented with stereotypes and even prejudice that affect our judgment. Our brain is constantly being rewired by every stimulus we’re exposed to; we can’t help it.

A different, more common type of implicit memory is procedural memory. The reason behind the coining of the phrase “it’s as easy as riding a bike”, procedural memory refers to exactly those types of actions and procedures that are ingrained in our motor memory. These memories do not require active retrieval – i.e. remembering – but rather feel inherent in our very organs. Much like priming, tying a shoelace or driving a car does not necessarily come into one’s full awareness.

Explicit memory, on the other hand, refers to those memories whose retrieval is entirely conscious by nature. Think of any historical fact you once studied and are trying to remember, a story your grandma told you as a kid, or the name of a person you just ran into. These are all things stored in your brain, and to retrieve them you use your explicit memory.


Enter a New Kind of Memory

Until recently, this historical division into explicit and implicit memory seemed to pretty much capture what was going on in our brains. These days, it seems out of date. As we are always online, we’ve become accustomed to using Google as an extension of our own explicit memory. Instead of only searching our brains for facts, we have an infinitely growing database of facts, indexed by Google, at our immediate disposal. Mobile web or laptop; home, work or on the go – Google is always at our fingertips.

And how do we search this enormous memory base? As I see it, we do so using our procedural memory. Think of how you google things – there’s usually a certain code behind what you’re looking for that you have to decipher, extract the keywords from, and search for. Say I was looking for a talk I saw by someone whose name I couldn’t remember, but I could remember he was a professor at Carnegie Mellon who spoke sometime this year about gaming and design. All I had to do was search for exactly that, and the 3rd result was Jesse Schell’s amazing talk about gamifying life. Tracking back my actions, I notice that this extraction of keywords and Google search came to me automatically. I didn’t have to contemplate, just search. Worst-case scenario, I’d see something weird in the results page and search again. Searching has become automatic, much like riding a bike.

So we’re using our procedural memory to search our augmented, Google-based explicit memory. This is what I call Referential Memory. Much like a reference in a book, our brain holds an internal reference that points us to external content – in this case, web content. Referential memory is simply using our procedural memory to extract facts from our Google-augmented explicit memory.
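The distinction maps neatly onto a familiar data structure: we keep a cue plus the procedure to dereference it, rather than the content itself. A toy illustration of the analogy (the framing and all the names here are entirely my own):

```python
# A referential memory holds cues and a pointer, not the content itself.
referential_memory = {
    ("carnegie mellon", "professor", "gaming", "design", "talk"):
        "https://www.google.com/search?q=carnegie+mellon+gaming+design+talk",
}

def recall(cues):
    """Procedural step: match remembered cues against stored references."""
    for stored_cues, pointer in referential_memory.items():
        if set(cues) <= set(stored_cues):
            return pointer  # dereferencing happens out on the web
    return None

print(recall(["gaming", "design", "talk"]))
```

The brain-side “storage cost” is just the cue tuple and the search habit; the facts themselves live at the other end of the pointer.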

What next, you ask? How will referential memory change us as individuals? How will it change humankind? Well, whether it be a worrying decline in creativity or the rise of collective intelligence, the implications are immense and can potentially alter society, people, intelligence – the world as we know it. But that’s a whole different story. For now, I’m merely hoping for some hard brain-science evidence that referential memory actually exists and is not a figment of my imaginative mind.