New Civilization News: Microcontent    
17 Aug 2004 @ 12:53, by Flemming Funch

"Microcontent" seems to be one of the buzzwords now. So, what is that, really?

Jakob Nielsen, interface guru, used it (first?) in 1998 about stuff like titles, headlines and subject lines. The idea being that first you might see just a clickable title, or a subject line of an e-mail, that you then might or might not decide to open. So, that title needs to be representative of the full thing, or you might not click it, or you'll be disappointed when you do. Microcontent (the title) needs to match macrocontent (the page, e-mail, article).

Now, that doesn't quite seem to be how "microcontent" is used nowadays. OK, on to 2002, Anil Dash says this, talking about a client for microcontent:
Microcontent is information published in short form, with its length dictated by the constraint of a single main topic and by the physical and technical limitations of the software and devices that we use to view digital content today. We've discovered in the last few years that navigating the web in meme-sized chunks is the natural idiom of the Internet. So it's time to create a tool that's designed for the job of viewing, managing, and publishing microcontent. This tool is the microcontent client. For the purposes of this application, we're not talking about microcontent in the strict Jakob Nielsen definition that's now a few years old, which focused on making documents easy to skim.

Today, microcontent is being used as a more general term indicating content that conveys one primary idea or concept, is accessible through a single definitive URL or permalink, and is appropriately written and formatted for presentation in email clients, web browsers, or on handheld devices as needed. A day's weather forecast, the arrival and departure times for an airplane flight, an abstract from a long publication, or a single instant message can all be examples of microcontent.
Oh, and an absolutely excellent article it is. It calls for the building of a client, a program that will allow us to consume and create microcontent easily. Not just aggregate it, but allow us to use it in meaningful ways. I.e., seeing the information the way we want to see it, without having to put up with different sites' different user interface quirks. Good examples he gives at the time are Sherlock and Watson on Macs. You can browse pictures, movies, flight schedules, eBay auctions and more, all from the same interface, and without having to go to the sites they actually come from. But we're still not quite talking open standards for all that.
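The common thread in Dash's definition is that every chunk of microcontent has the same basic shape, whatever site it came from, which is what lets one client present them all uniformly. A minimal sketch of that idea (the class, field names, and sample data here are all made up for illustration, not any real client's format):

```python
from dataclasses import dataclass

@dataclass
class MicrocontentItem:
    """One chunk of microcontent: a single idea behind a single permalink."""
    title: str       # the Nielsen-style micro summary
    permalink: str   # the single definitive URL
    body: str        # the short-form content itself
    source: str      # which site or feed it came from

def render(items):
    """Present items uniformly, regardless of which site they came from."""
    return "\n".join(f"[{i.source}] {i.title} <{i.permalink}>" for i in items)

items = [
    MicrocontentItem("Rain likely", "http://example.com/wx/today",
                     "60% chance of showers.", "weather"),
    MicrocontentItem("Flight 42 on time", "http://example.com/fl/42",
                     "Departs 14:05.", "flights"),
]
print(render(items))
```

The point of the uniform shape is that `render` never needs to know whether an item was a weather forecast or a flight status, which is exactly the Sherlock/Watson experience.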

What is needed is the semantic web, of course. Where all content has a uniform format, and is flagged with pieces of meaning that can be accessed and collected by machines. It isn't there yet. Many smart people are playing with pieces of it, like Jon Udell, or Sam Ruby. Or, look at Syncato. All stuff mostly for hardcore techies at this point. But the target is of course to eventually let regular people easily do what they find meaningful with any data that's available on the net.
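The "flagged with pieces of meaning" idea boils down to statements of the form subject-predicate-object, which is the core of RDF. A toy sketch of that (the URLs and `dc:` property names here are illustrative, loosely echoing Dublin Core, not a real store):

```python
# A toy triple store: (subject, predicate, object) statements,
# the core data model behind RDF and the semantic web.
triples = [
    ("http://example.com/post/17", "dc:title",   "Microcontent"),
    ("http://example.com/post/17", "dc:creator", "Flemming Funch"),
    ("http://example.com/post/17", "dc:date",    "2004-08-17"),
]

def query(predicate):
    """Collect every value flagged with a given kind of meaning."""
    return [(s, o) for s, p, o in triples if p == predicate]

# A machine can ask "who wrote this?" the same way on any site's data.
print(query("dc:creator"))
```

Because the meaning lives in the predicate rather than in page layout, a machine can collect "all creators" or "all dates" across sites without scraping each site's HTML quirks.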




17 Aug 2004 @ 17:42 by Quirkeboy @ : Geographic Information System..
With your addiction to finding new and innovative ways to organize information.. I thought you might find this link interesting:

This system allows you to show a wide array of information visually. Maps are connected to databases.. you choose what data you want represented. Multiple layers on a map represent different data sets.. and are all connected to the database.. When the data changes, the map changes with it. This may seem to have limited uses until you really think about it. What if you had the addresses of people who are overweight in your town.. and you also had the addresses of fast food restaurants in the area.. you could produce a map that shows the proximity of overweight people to junk food.
What I'm imagining is open source data sets.. people enter their personal tastes.. their hobbies.. their eating habits.. sleeping habits.. everything!! And with these databases you could have an open source GIS program.. anyone could access and use it over the net.. and the user could choose what data to represent on the map. You could have spatial data represented in maps.. and quantitative data shown in easy-to-understand bar charts.
In other words.. you may discover hidden patterns in our culture that aren't readily apparent. Who knows.. maybe there's a correlation between what you ate in your school lunch.. and your annual income as an adult?
I've seen one GIS map that showed the location of car accidents in relationship to the location of pubs in the city.. and it was VERY revealing!! Any other ideas??
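The overlay idea in the comment above — two point layers joined by proximity — can be sketched in a few lines. Everything here is made-up illustration: the layer names, coordinates, and the flat Euclidean distance (a real GIS would use projected coordinates or great-circle distance):

```python
from math import hypot

# Two hypothetical point layers: (name, x, y) in arbitrary map units.
people = [("A", 1.0, 1.0), ("B", 5.0, 5.0)]
restaurants = [("Burger Hut", 1.2, 0.9),
               ("Fry Shack", 1.1, 1.3),
               ("Salad Bar", 9.0, 9.0)]

def nearby(x, y, layer, radius):
    """Count features of one layer within a radius of a point: one map overlay."""
    return sum(1 for _, fx, fy in layer if hypot(fx - x, fy - y) <= radius)

# Recompute the overlay whenever the database changes, and the "map" follows.
for name, x, y in people:
    print(name, nearby(x, y, restaurants, radius=1.0))
```

The pub/accident map in the comment is the same operation with different layers, which is why a general-purpose open GIS over shared databases is plausible.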

17 Aug 2004 @ 18:01 by swanny : How about....
How about a link preview.....
It could put up a popup or a bubble when you moused
over a link..... the popup would contain the meta tag data
of the page, and if you saw a title or some content or some
keywords of interest, you'd probably be more likely to click the link.
It would either have to be part of the browser software or a CGI,
but it shouldn't be hard to do.....
They sort of do it now: on some pages you mouse over a link and it displays a
popup of other links, which is a bit useless and oversimplistic....
Those don't seem to work properly either, because they seem to "stick"
at times and make your page go off like popcorn.....
No, just give me one link preview per link and that would be more helpful.
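The extraction half of that preview idea — pulling the title and meta tags out of a fetched page — is straightforward with standard-library tools. A minimal sketch, with a made-up HTML snippet standing in for the fetched page (the popup/mouseover half would live in the browser):

```python
from html.parser import HTMLParser

class MetaPreview(HTMLParser):
    """Collect <title> and <meta name=... content=...> tags for a link preview."""
    def __init__(self):
        super().__init__()
        self.meta = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and "name" in a and "content" in a:
            self.meta[a["name"]] = a["content"]
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.meta["title"] = data.strip()

# Stand-in for a page fetched from the link's URL.
html = ('<html><head><title>Microcontent</title>'
        '<meta name="description" content="What is microcontent?">'
        '</head></html>')
p = MetaPreview()
p.feed(html)
print(p.meta)
```

One fetch plus this parse per hovered link is all the server-side (or extension-side) work the suggested preview would need.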

18 Aug 2004 @ 16:34 by QMAL @ : perhaps add a little A.I.
How about adding a little AI to the mix, and possibly creating and releasing some data-gathering flag-and-tag worms? For instance: you extract sets of microdata from the macrodata and process them in certain ways. The system remembers each created data set and compares it to a predetermined set of questions; every time it creates a new relationship, it asks a new question along with all the previous questions — a kind of question generator. Then, when you or someone else asks for that data set again, or something similar, the system has already integrated it and cross-analyzed it with all the related data it could find. While doing all that, it could also create data-gathering flag-and-tag worms and send them out with each new question; if they can't flag or tag, they extract, capture, and send the info home. That gives the system exponentially more to draw on to create the desired new micro set, further compounding the process as data comes back from the worms — pre-cross-referencing, ever self-expanding with a roving periphery, just as our own brains do.
For instance, in the 80s I worked on the first American-made computer-controlled production vehicles (I am a V8 mechanic). Fixing these was a process of gathering data in the form of codes from the vehicle, reading a book, then going through a complex set of measurements, circuit by circuit, to eventually figure out the problem. I thought there had to be a shorter way: I needed to see all the data at once, its action and reaction. It wasn't long before we had a computer to connect to the ALDL port (a serial port) of those primitive 8086 machines and see what all the sensors and servos were doing at the same time — individually, and later on together on one screen. At that point the brain can easily see: oh, it's the coolant sensor, out of range.
So I would test the coolant sensor first, shortening the diagnosis and repair process. Then I thought: what am I doing when I look at the real-time graphic? I am cross-referencing the patterns and linking them with the solution direction by asking questions. I started creating a database of customers' real-time engine data, along with data about the customers themselves, gathered by asking them questions, and soon I started to notice trend problems in certain models, and across all models. Well, there is no apparent correlation between hair color, hot-wire airflow sensor failures, and license plate numbers. But I thought: if I had enough data cross-referenced, I could tell the customer up front what the most probable problems were, easing their tension over the wild-card effect of having their check-engine light on. People get most upset about their machines; everyone was fairly scared of tech stuff back in the day. I might also have been able to tell the manufacturer about ridiculous design problems and repair procedures that were wasting my time. If I could have gotten the database to query in an automated fashion that asked for those relationships up front — perhaps sought out the data itself — even more valuable data would have emerged and could have been compounded into the system, combined with the real-time data when needed. I realized it would have to be organized quite differently to even approach those results. Well, I suppose that's a little more than is needed to fix the car. That's me, though: fix their machines, minds, and sometimes their matter. I run into the problems Ming talks about here all the time, looking at weather data, sunspots, all kinds of things, looking for patterns. I guess we all do in one way or another. It's troublesome to wade through a bunch of data to arrive at some simple construct.
We're handed a ridiculous amount of insignificant data along with our microdata. Not only do we have to look at it, some of it lingers around and comes up in weird places, even seizes bandwidth and runs off with your data and other stuff. The electrical power this wastes is ridiculous by itself, not to mention the CPU availability lost to more constructive uses, and one's personal time and thought. A system like this would bypass the info-lag content that infests the digital byways of today, possibly rendering that content useless eventually. It would greatly amplify the collective thinking power that could occur through the internet, now and in the future. So if you and others can create what you're describing, it will surely change things as much as the internet itself did, perhaps. Nothing like making a good tool better. A great project, Ming — I am glad you have the concentration for it, the process of taking mind to the machine. Take them to matter and you'll really be cooking...... :)


