Showing posts with label internet.

2010-08-16

Life after life of Google Wave

It is a little sad, but Google is ending development of the Wave product because, in their words, "Wave has not seen the user adoption we would have liked".

I liked the whole "Wave" idea presented at the Google I/O conference. It was a new and powerful tool, covering the features of multiple existing services like instant messengers, chats, forums with threads and more. On the other hand, people were scared off by the initial learning curve, even by having to understand at the start what the service was for. Compare another successful service, Twitter, which wins with the simple question "what's happening now" and a box for a plain text message. It seems that simplicity of use and an ascetic feature set won in this case.

Wave has unique features, and live collaborative editing is the one I like most. It is a perfect tool for a team collaborating on a document and writing it quickly. Google Wave is an open source project, so I hope such features will be incorporated into online editing suites in a way that makes collaboration almost effortless. It should also find its place on some forum-based sites.
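As a side note, here is a minimal sketch of how concurrent edits can be reconciled so that everyone ends up with the same document. It is a toy, insert-only version of the operational transformation idea used for live editing in tools like Wave; the details are my own simplification, not Wave's actual protocol.

```python
# Toy, insert-only operational transformation (OT), the general idea behind
# live collaborative editing. Real systems also handle deletions, ties at the
# same position, and server-side ordering; none of that is covered here.

def apply(doc, op):
    """op = (position, text): insert text at position in doc."""
    pos, text = op
    return doc[:pos] + text + doc[pos:]

def transform(op, against):
    """Shift op so it still means the same thing after `against` was applied
    concurrently to the same original document."""
    pos, text = op
    other_pos, other_text = against
    if other_pos <= pos:
        pos += len(other_text)
    return (pos, text)

# Two users edit the same starting document at the same time.
doc = "collaborative editing"
op_a = (0, "live ")             # user A prepends "live "
op_b = (len(doc), " in Wave")   # user B appends " in Wave"

# Each side applies its own edit, then the other's edit transformed against it.
result_a = apply(apply(doc, op_a), transform(op_b, op_a))
result_b = apply(apply(doc, op_b), transform(op_a, op_b))
assert result_a == result_b == "live collaborative editing in Wave"
print(result_a)
```

The key property is that both sides converge to the same text without anyone locking the document, which is what makes the editing feel effortless.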

If you miss the service, look for Google Wave offspring or build your own; the code is publicly available.

2009-12-21

Reviewing printed vs. on-screen documents

I have observed an interesting phenomenon: reviewing a printed document reveals more mistakes in the text than looking at the same document on a computer screen. And I think I'm not alone; I've seen other people have the same problem too.

It has to be something about focus on the task of reviewing a document and the ability to concentrate in a given context. Computer screen vs. paper document: what is this all about? Is a paper document simply better to read? I think it's related to the casual context associated with each medium.

My computer (screen) context is something like:
  • work (reading, writing, drawing, programming),
  • news,
  • learning,
  • entertainment (movies, games),
  • sharing some functions with TV,
  • promotes multitasking,
  • contains more chaotic, lower-quality content that needs more filtering and preprocessing.
Paper context:
  • books (facts and fiction),
  • documents (more important ones, not just electronic versions printed for convenience),
  • better quality content that needs focus.
I spend about 95% of my time with the computer screen and 5% with paper, so the rarity of paper may also make it more interesting for the brain.

While using a computer I'm doing many things at once, multitasking and filtering low-quality information, which reduces deep focus. Paper, in comparison, switches my consciousness into a context that makes me more deeply focused and less distracted (no multitasking), and that makes finding fine text errors easy. I don't even need to think about it; it's how my brain has programmed itself to choose the proper context. It's no wonder that computers (and the Internet) put me into an "information trash" mode in which I can't see details.

I'm not mentioning the eye-strain effect, because with good modern LCD screens and readers using electronic paper the impact is small, and the difference between the media in that area is vanishing.

One idea for staying focused on a text without a paper version is to read it in a simple, non-distracting computer environment: full-screen mode, no other windows, no bells and whistles (easy), and to treat it like high-quality content with meaning (that's harder). It may take some time to rewire the brain, but that's probably the way to go.

2009-11-26

Chrome OS - the dawn of net(only)books

Google is making a good strategic move to gain control and share of the netbook, and maybe nettop, market. The coming era of ubiquitous mobile internet connectivity for everyone makes the concept of thin clients and network systems plausible. The idea will probably be accepted mainly by the mainstream population (look at software usage patterns and the emerging fast-booting systems launched from BIOS on desktops for basic web/email operations). If the big G stays behind it, we can expect hardware producers to answer. Looking at Android's history, this could mean massive adoption of a "new" OS concept.

There have been similar successful concepts like Eye OS, but I tend to use separate, specific web services best suited to one task at a time, like email, a document editor, sharing, etc. Google already has those applications, which is another reason they could be available as Chrome OS assets.

Some experts claim that the future lies only in web applications. But for more advanced or geeky users that could be the wrong way. It's fine to have most tools as online services, but there is still a place for lower-level native (and offline) platform applications. Either Chrome OS "native applications" will be sufficient, or Chrome OS will be installed as a second system on "fatter" devices.

One thing is for sure: current web application technology is still evolving, so this is only the beginning.

2008-09-15

Google Chrome - one more step on the way to a perfect client for web applications

The Google Chrome browser has been getting a lot of attention from the start. It is a browser built from scratch for better performance, usability and security. Isn't that a good fit for web applications, even at the enterprise level? As Forrester Research analyst Sheri McLeish says in an Information Week article, Google's Chrome browser "is not top of mind for IT organizations". It's still in beta, and Google is its only major backer, with its own unclear reasons.

There can't be a "perfect" browser for everyday surfing and, at the same time, for using enterprise web applications. But it's a good step on the way to "perfection". In the worst case, its best features could be adopted by the other web browser players. On the other hand, if Chrome gets out of the beta stage, it could take a big share of users thanks to its usability and performance.

I will be watching its progress.

2008-08-12

Vacation - planning the day with a weather service

I am spending my vacation on the Polish Baltic Sea shore, a place where the summer weather is a mix of wind, sun and rain. In these circumstances, a good weather forecast service is really helpful.
I check weather forecasts at the ICM institute site. It uses a numerical prediction model for short-horizon, two-day forecasts. The forecasts are detailed and precise for a selected area; unfortunately, they are available only for Europe. The service is really good, and it has saved me from biking in the rain many times.
So now I am checking when the rain will stop so I can take a walk on the beach.

2008-06-24

Day off and mobile internet

Disclaimer: this post should have been published a day earlier, but it appears today due to unusual mobile posting problems.

I took a day off to enjoy summer in beautiful natural surroundings.
Being in possession of a cell phone with some smartphone capabilities, I decided to try out mobile internet.

A few days ago I turned on GPRS service with my cell provider. Every day I am surrounded by computers, almost all of them connected online.

What do I need internet on my cell for? One reason is convenient online services like weather or dining search. The other explanation is that I am a computer/internet addict.
I have tried some useful software on my Nokia S40 phone.
Here is my list:
  • Opera Mini - good for search and news browsing,
  • Gmail mobile - explanation is not needed,
  • MidpSSH - a nice SSH client, for use in critical situations,
  • some IM / SIP VoIP client - still searching.
The real pain is text input; it would be nice to have a better method than T9.

Enough. I am going for a walk.

2007-12-10

Thunderbird and everyday spam fighting

How many emails do you get every day? How many of them are not spam?
I get about 200 spam emails every day. I'm using the Thunderbird mail client with its built-in spam filter, but it "eats" only about 15% of my spam. I don't think it's a matter of training data, because I have several thousand correctly marked messages.
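For context, here is a minimal sketch of the Bayes-style scoring such junk filters rely on; the token counts, smoothing and the 0.9 threshold are illustrative assumptions of mine, not Thunderbird's actual implementation.

```python
# Toy Bayes-style spam scoring, the general technique behind junk filters
# like Thunderbird's. Counts, smoothing and threshold are illustrative only.
from collections import Counter

def train(messages):
    """messages: list of (tokens, is_spam) pairs."""
    spam, ham = Counter(), Counter()
    n_spam = n_ham = 0
    for tokens, is_spam in messages:
        if is_spam:
            spam.update(set(tokens)); n_spam += 1
        else:
            ham.update(set(tokens)); n_ham += 1
    return spam, ham, n_spam, n_ham

def spam_probability(tokens, spam, ham, n_spam, n_ham):
    """Combine per-token likelihoods into one score (naive Bayes)."""
    p_spam = p_ham = 1.0
    for t in set(tokens):
        # Laplace smoothing so unseen tokens don't zero out the product.
        p_spam *= (spam[t] + 1) / (n_spam + 2)
        p_ham *= (ham[t] + 1) / (n_ham + 2)
    return p_spam / (p_spam + p_ham)

# Example: a tiny training set and one incoming message.
data = [(["cheap", "pills", "offer"], True),
        (["meeting", "tomorrow", "agenda"], False)]
model = train(data)
score = spam_probability(["cheap", "offer", "now"], *model)
print("spam" if score > 0.9 else "ham", round(score, 3))
```

A filter like this can only score tokens it has seen before, so with several thousand correctly marked messages the training set should not be the bottleneck.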

Somebody wrote about manually preselecting the best representatives of each type of spam and then feeding them to the spam filter. But come on, I need an automated method.

So I started looking for the best spam filters I could use for free. A very popular and tempting option is to forward all your email to Google Mail and use it as your mail client. Its filtering capabilities are indeed very good, but what if you don't want to redirect all your email to the "G empire"?

There are good open source spam filters that work as proxy servers, like the fabulous SpamAssassin. It's a good idea to set up that kind of spam filtering server for an intranet.

I needed something different. While searching for Thunderbird spam filtering plugins, I found and tried one interesting product: Spamato.

It can work as a standalone mail proxy, but I used the Thunderbird plugin version, spamato4thunderbird. There is also an MS Outlook version, if anybody is interested.

The installation procedure is easy, as for most Thunderbird plugins. Basic configuration: point it at your bin/java path and give a registration email (Spamato can send spam statistics to a central server). Oh, I mentioned Java. Yes, it's written in Java; the process takes about 70 MB under Windows, and it's not blazing fast. That's not a problem for me: I check email a couple of times per day and don't need to run Thunderbird all day. It's a matter of work organization.

What you get after installation: a beast with a pluggable architecture, a dozen filter and decision-maker modules, configurable via an HTTP interface and displaying nice charts. Oh, and there is that funny (annoying) sound after every email marked as spam; it's good that you can easily turn it off.

The experience of using it: surprisingly good. Filtering results: no worse than the Google Mail filter. There were a few false positives at the beginning (ham marked as spam), but all these filters keep learning. All messages marked as spam are moved into a special Spamato folder (check the configuration). You can view the spam history and correct (teach) the filters via the web interface. The big negative: Spamato is not integrated with Thunderbird's native junk mail tools, so marking something as junk in Thunderbird doesn't count for Spamato; you have to correct the filters in Spamato separately. There is a partial solution to that problem: you can configure Spamato to detect message movements in and out of the Spamato folder on the IMAP server.

So I'm going back to reading my spam, errrgh, email inbox.

2007-09-22

Politics, sources of information and reliability network

Every day you can hear what the government is doing, so you are informed, aren't you? How many people understand what lies between the lines of a speech or statement prepared and crafted by PR people? Do you know what exactly those people are doing? Do you believe politicians' words?

During an election campaign I want to know the candidates' past achievements and have context to evaluate their current plans and promises. The idea of democracy is that everyone should be able to do that.

This is where government monitoring programs come in, backed by governmental bodies or by independent NGOs like Transparency International. On the other end we have press and media coverage of government actions. Every day we are bombarded by a vast amount of fragmented, incomplete and untrusted information; our brains treat it as noise.
The Internet has given a voice to independent individual journalists. They are trying to consolidate their efforts to become a more visible and trusted source of information. You can find many sources, like the CyberPolitics blog tracking how the media use the Internet and technology to cover the US presidential campaign. But for most citizens they are unknown and untrusted.

So I see two main problems: one is the quality of information and the second is reliability. I like the idea of Wikipedia. Wikipedia contributors are huge companies, individual people and other organizations. They are building a base of knowledge that improves over time. Good information quality needs processing, and here the joint effort of contributors pays off. What about reliability? The aim of a project by UCSC associate professor Luca de Alfaro is to automatically estimate the trustworthiness of each page based on the article's and users' edit history. It's an interesting approach through analysis of huge-scale publishing. It's not perfect, but better than nothing.
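Here is a rough sketch of that kind of edit-history analysis, under a simplified model of my own: authors earn reputation when the text they added survives later revisions, and the trust of a page is derived from its authors' reputation. This is only an illustration of the idea, not the actual algorithm used in de Alfaro's project.

```python
# Simplified trust-from-edit-history sketch, loosely inspired by the idea
# described above. The survival heuristic is an illustrative assumption.

def author_reputation(revisions):
    """revisions: list of (author, set_of_sentences) in chronological order.
    Reputation is the fraction of an author's added sentences that still
    survive in the latest revision."""
    if not revisions:
        return {}
    latest = revisions[-1][1]
    added, survived, seen = {}, {}, set()
    for author, sentences in revisions:
        new = sentences - seen              # sentences first introduced here
        seen |= sentences
        added[author] = added.get(author, 0) + len(new)
        survived[author] = survived.get(author, 0) + len(new & latest)
    return {a: survived[a] / added[a] for a in added if added[a]}

def page_trust(revisions):
    """Trust of the latest revision: average reputation of its contributors."""
    rep = author_reputation(revisions)
    return sum(rep.values()) / len(rep) if rep else 0.0

# Example: two authors; some of their text was removed in a later revision.
history = [
    ("alice", {"s1", "s2", "s3"}),
    ("bob",   {"s1", "s2", "s3", "s4", "s5"}),
    ("alice", {"s1", "s2", "s4"}),   # latest revision: s3 and s5 were removed
]
print(author_reputation(history))    # alice ~0.67, bob 0.5
print(round(page_trust(history), 2)) # 0.58
```

The attraction of this kind of approach is that it needs no manual reviewing: the edit history that Wikipedia already keeps does the judging.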

We have one tool, we will invent another, and maybe something like a reliability network can be established. Imagine every person or organization acting as a source of information having a place in a reliability network that keeps statistics about their reputation and reliability. It's not easy to make a secure, hard-to-exploit framework, but I believe it's possible: keep that kind of information in a structured framework and run evaluation algorithms to improve information quality. So maybe next time I will be able to get consolidated information and verify its source, for example at a politics-gov-wiki site.
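To make the idea concrete, here is a small sketch of what such a reliability network could track, under my own assumptions: sources are nodes, reliability is the fraction of their claims that were later verified, and an endorsement from another source lends some extra weight. The names and the 0.8/0.2 weighting are hypothetical.

```python
# Hypothetical "reliability network" sketch: sources report claims, claims
# get verified or refuted, and a source's score combines its own track
# record with the track records of the sources that endorse it.
from dataclasses import dataclass, field

@dataclass
class Source:
    name: str
    verified: int = 0                                   # claims later confirmed
    refuted: int = 0                                    # claims later debunked
    endorsed_by: list = field(default_factory=list)     # names of endorsers

class ReliabilityNetwork:
    def __init__(self):
        self.sources = {}

    def add(self, name):
        self.sources.setdefault(name, Source(name))

    def record(self, name, confirmed):
        if confirmed:
            self.sources[name].verified += 1
        else:
            self.sources[name].refuted += 1

    def endorse(self, endorser, endorsed):
        self.sources[endorsed].endorsed_by.append(endorser)

    def _track_record(self, name):
        src = self.sources[name]
        total = src.verified + src.refuted
        return src.verified / total if total else 0.5   # unknown -> neutral

    def score(self, name):
        """0.8 * own track record + 0.2 * mean track record of endorsers."""
        own = self._track_record(name)
        endorsers = self.sources[name].endorsed_by
        peer = (sum(self._track_record(e) for e in endorsers) / len(endorsers)
                if endorsers else own)
        return 0.8 * own + 0.2 * peer

# Example: a watchdog NGO with a good record endorses a new, unknown blog.
net = ReliabilityNetwork()
net.add("watchdog-ngo"); net.add("new-blog")
for _ in range(9):
    net.record("watchdog-ngo", confirmed=True)
net.record("watchdog-ngo", confirmed=False)
net.endorse("watchdog-ngo", "new-blog")
print(round(net.score("new-blog"), 2))   # 0.58: neutral, nudged up by the endorsement
```

The hard part, as I said, is making such a framework secure and hard to exploit, but even simple statistics like these would be better than the noise we get today.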