Webstractions - Web Development & Design News
Commentary on new events and information concerning web development, design practices, search engines, SEO, tools, news story headlines and what's new at WebStractions.
The Great Firewall of China problem is ongoing. It applies only to the viewing of blogs at Blogspot, but apparently does not affect the maintenance of those blogs. Blog maintenance is handled via the blogger.com domain.
Blogspot.com is not the only domain affected in China; Wikipedia and Google (intermittently) are blocked as well. You can check whether your domain is blocked with a real-time tool at greatfirewallofchina.org, which has a test server stationed inside China. Be aware, however, that the tool may report a site as blocked when there is simply a technical reason, such as temporary unavailability.
The censorship methods used by the Chinese government are becoming more sophisticated, more refined and more extensive every year, involving an increasing number of local as well as foreign parties in their system.

It is puzzling why the Chinese will let their people create and maintain a blog anonymously, but block them from viewing that blog. Be that as it may, a resourceful Chinese individual created a loophole, www.adoptablog.org, through which you can adopt a Chinese blog to help keep these bloggers online -- anonymously.
According to state media, by the end of 2006 there were 20.8 million bloggers in China. Blogging, which implies venting your own opinions, has become immensely popular in China. In order to control the phenomenon the government wants blog users to register under their real name.
Bypassing the Chinese firewall is being approached from within and outside China by academics, security experts and hackers. Western academics came up with some promising ways to circumvent the firewall, but it may be a matter of time before the Chinese Government will counter those measures. Plus there is the question of whether or not the average Chinese citizen has the technical expertise to find a method (tunneling, anonymous networks, ignoring reset protocols) and apply it.
For now anyway, it appears that China's children should be seen and not heard.
Technorati Tags: china, chinese, firewall, great firewall, censorship, blogspot, blogger, google, proxy, anonymous, tunneling, reset protocol, networks, adopt-a-blog
Eris Ristemena, from Indonesia, was nominated for a PHP Classes Innovation Award in October 2006. Voting earned him the third place prize, beaten out by Another CAPTCHA Project (why?) and Subversion::Dynamix. He received his choice of any O'Reilly book. Incidentally, Eris' blog uses WordPress.
One usage possibility is an online post entry form, similar to Blogger's, except that you could add in extra bells and whistles such as Trackback pinging and Technorati Tags. It would also be nice to plug the original article's URL into the Blogger Link field and auto-discover the actual Trackback URL.
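On the auto-discovery piece: trackback-enabled blogs embed a small RDF block in their pages, and the ping URL lives in its trackback:ping attribute. Here is a minimal sketch of pulling it out -- this is my own illustration of the Trackback spec's mechanism, not part of Eris' ngeblog class, and the sample markup is hypothetical:

```python
import re

def discover_trackback_url(html):
    # The Trackback spec has blogs embed an RDF description whose
    # trackback:ping attribute carries the URL to ping.
    match = re.search(r'trackback:ping="([^"]+)"', html)
    return match.group(1) if match else None

# Hypothetical sample of the embedded RDF a blog platform emits.
sample = '''<rdf:RDF xmlns:trackback="http://madskills.com/public/xml/rss/module/trackback/">
  <rdf:Description rdf:about="http://example.com/post/42"
      trackback:ping="http://example.com/tb.cgi/42"/>
</rdf:RDF>'''

print(discover_trackback_url(sample))  # http://example.com/tb.cgi/42
```

A real implementation would fetch the page for the URL typed into the Blogger Link field and run this over the response body.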
I will be playing around with this a little bit. Emphasis on the word "play".
Technorati Tags: php, class, ngeblog, blogger, google, data, api, authenticate, zend_gdata, zend, gdata
It soon became clear that John's Sitemap was part of the problem. It was in conflict with his robots.txt file. Skitzzo of SEO Refugee discovered the differences between a cached version (saved here also) of the file and the now drastically altered version.
Examination of the old robots file showed that it disallowed Googlebot from his archived monthlies, feeds, trackbacks, files ending with the extensions .php and .xhtml, and any pages containing a question mark (?).
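A robots.txt with rules along those lines would look something like this -- a hypothetical reconstruction for illustration, not John's actual file. Note that the * and $ patterns are Googlebot extensions, not part of the original robots.txt standard:

```
User-agent: Googlebot
# monthly archives
Disallow: /2006/
# feeds and trackbacks
Disallow: /feed/
Disallow: /trackback/
# files by extension
Disallow: /*.php$
Disallow: /*.xhtml$
# any URL containing a question mark
Disallow: /*?
```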
What prompted the change in the first place? Jez found another article of John's on how to get your pages out of Google's supplemental index. At the time of the post, he had 1,790 supplemental results. After a robot file tweak, he has managed to remove 10 of those pages from the index. Good job John!
More importantly, his robots.txt tweak had another nasty side effect. Not only were pages being removed from the supplemental index, he was losing regular indexed pages as well. John had 3,190 pages in the index in total. His robots.txt file effectively wiped out 340 (non-supplemental) pages, and he is now down to 2,840. Excellent job John!
But, John does not mention the robots.txt change in his post. Nor is there an update to the supplemental index post. Instead, he is trying to milk his secret for everything it is worth. And in John's case, he is looking for more money.
I was going to use this post to explain exactly what I did to restore my number one ranking. However, after reading Kumiko’s comments in my Taipei 101 to number 1 post, I’ve decided against it. I think everyone will agree that this kind of information is extremely valuable - some “SEO Guru” tried to take me for $4,000 by saying he knew the answer (which I highly doubt since he made no guarantee).

Whether this change in the robots file was the reason for John's return to number one, or whether it was just the Google update process taking a few days to settle down, is not an issue. People will probably be debating this for weeks to come.
What is an issue is that John seems to think that he is onto something; I genuinely believe that. But I also know that John knows how dangerous his supplemental index post is and is afraid to admit it. Meanwhile that supplemental post is wrecking Google results for everyone who hangs on John's every word -- John is not only evil, he is an egotistical bastard who obviously does not care about his readership. Grade-A job John!
I did warn people of Google's intent to crack down on paid links, and highlighted that fucking with their NoFollow brainchild by offering links up as Pay To Dofollow (PTD) may be a disaster in the making.
Google's latest in a series of algo updates has cracked down even tighter on paid links from publishers. It appears that they are punishing publishers even further with ranking reductions. The easy-to-spot targets for rank reduction would be sections such as sidebars and footers. Ferreting out dofollow links buried inside a sea of nofollows is, without any stretch of the imagination, very doable -- or at the very least enough to raise a red flag.
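To see just how little effort that ferreting takes, here is a toy illustration -- obviously not Google's algorithm, and the page snippet is made up -- that counts dofollow links swimming in a sea of nofollow'ed ones:

```python
import re

def follow_counts(html):
    # Grab every anchor tag, then split them by whether the tag
    # carries a nofollow marker. Crude, but the pattern stands out.
    anchors = re.findall(r"<a\s[^>]*>", html, re.IGNORECASE)
    nofollow = sum(1 for a in anchors if "nofollow" in a)
    return len(anchors) - nofollow, nofollow  # (dofollow, nofollow)

# Hypothetical comment section: three nofollow'ed comment links
# and one paid dofollow link hardcoded among them.
page = """
<a href="http://spam1.example/" rel="nofollow">x</a>
<a href="http://spam2.example/" rel="nofollow">y</a>
<a href="http://paid.example/">paid link</a>
<a href="http://spam3.example/" rel="nofollow">z</a>
"""

print(follow_counts(page))  # (1, 3)
```

If a scraped-together script can surface that ratio, a search engine certainly can.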
Being so close to the main content of the page (the blog post), PTDs may require a little more tweaking on Google's part to detect. Whether or not this was one of the factors in John's reduction is hard to say. However, there are numerous obvious link candidates on his pages to warrant the flaccid nature of his evidently limper ranking impotency.
I would also like to mention, to all of the people who were suckered into PayPal-ing for a PTD from John: this is not a one way street. Google may be waiting for the light to change before proceeding through the intersection to slap you with a penalty as well. Hopefully you will have noticed that the light has changed and will not get T-boned by Google in the process.
Technorati Tags: google, nofollow, dofollow, paid, links, penalty, pagerank, serps
New Google Translate Search
As you know, I am a big Serebro fan and they are 100% homegrown Russian girls. So let's put this baby to the test. Now I know that Google's Russian translation machine is in Beta, but hey, they have it listed in the drop down menu. Right.
My first crack at this was to see if it could find the MTV Russia page where I saw some kind of online chat session between MTV and the girl trio. So I enter "serebro MTV chat" and hit enter. Ah, results -- two columns of them: one for the original language and the other for the translated page.
Well MTV is at the top, but it lands on the home page at MTV Russia and it does not look any different from the original. Sigh. I can see the link for Serebro though. Don't need to understand Russian if it is written in images! And if there is one language I can read, that is Serebro.
Interestingly enough, some of the other results were about unclothed celebs and live internet, something or other. And here I thought the US was the pRoN capital of the world. Boy was I mistaken, Russia has some goods on us there. And some mighty fine goods indeed.
Something seems to be broken on the new search page. If you use the ol' tried-and-true Translate Text box on the original Google Translate page, the translation is not too bad, even for Russian. I was able to piece together parts of the conversation between the participants, albeit a tedious process. I just don't understand why this is not wired into the new page correctly.
Anyways, I just broke the tip on the end of my scissors and almost out of school paste. Probably should get up to Walmart before the storm rolls in tonight.
Technorati Tags: google, translate, russian, english, web, pages, foreign, language
Recent estimates place Firefox with a little over 15 percent of the market and Internet Explorer around 78 percent. Most of the market share that Firefox has gained on Internet Explorer came with the royalty deal that Google struck with the now for-profit Mozilla Foundation, which has 90 employees and revenue of more than $100 million over the last couple of years.
Mozilla plans to make enough money to keep growing ... Google, which, like the other search companies, is always competing for better placement on browsers. Under the agreement, the Google search page is the default home page when a user first installs Firefox, and is the default in the search bar. (Google has a similar placement with Apple’s Safari.)

The transparency of Mozilla is being called into question by many critics, mostly because of the level of secrecy that has to be maintained in its arrangements with Google. That issue caused tension around getting the deal done and around disclosure.
Other critics claim that Mozilla is perceived as an extension of Google:
... they note that one of Google’s growth areas, Web-based software applications, would have a better chance of success with a browser not controlled by its biggest rival, Microsoft.

With money comes change. Firefox is evolving in directions that nobody would have imagined a couple of years ago.
The surge in popularity of Firefox has caused a backlash from Microsoft, which is shelling out more money for the advancement of Internet Explorer. The release of IE 7 showed remarkable improvements over previous versions and demonstrated to the Google/Mozilla camp exactly what Microsoft is capable of accomplishing.
When Firefox launched over five years ago, "it burst on the open-source browser scene like a young Elvis Presley -- slim, sexy and dangerous," says Scott Gilbertson of Wired. But now he fears that, with the "IE killing" release of Firefox 3.0 later this year, the browser will be in danger of becoming the later version of Elvis -- fat.
Anecdotal reports of problems, from sluggishness to slow page loads and frequent crashes, have begun circulating in web forums, along with increasingly loud calls for Firefox to return to its roots. The alleged culprit: bloat, the same problem that once plagued Mozilla, the slow, overstuffed open-source browser spawned by Netscape that Firefox was originally meant to replace.

The "roots" were Firefox's small memory footprint, fast load time and extensibility through plugins. It was a roll-your-own, bare-bones browser. It was this root that became one of Firefox's major selling points to non-geeky computer users. But now that is coming under more scrutiny, as reader polls cite Firefox's mysterious habit of gobbling up huge chunks of memory as their number one complaint.
Actual data is hard to come by, but Mike Schroepfer (Mozilla's vice president of engineering) opines that memory problems can be blamed on the user's environment, which is influenced by other software, add-ons and extensions. To keep the bulk down, Schroepfer's team sets a high threshold for the addition of features: new features aren't built in unless they are useful to at least 90 percent of Firefox's users.
Despite those safeguards, some now-standard features could be adversely affecting performance.
It is a fine line that Firefox has to walk when trying to find a balance between what is perceived to be a "needed" feature and one that is not. What is too much? And does Firefox have a choice in that matter?
Firefox's page-cache mechanism, for example, introduced in version 1.5, stores the last eight visited pages in the computer's memory. Caching pages in memory allows faster back browsing, but it can also leave a lot less memory for other applications to use. Less available RAM equals a less-responsive computer.
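If that trade-off bothers you, the page cache is tunable through a hidden about:config preference. The pref name below is the one the Firefox 1.5/2.0 builds use; -1 is the default, which lets Firefox size the cache based on available RAM:

```
// user.js (or edit live via about:config)
// 0 disables the back/forward page cache entirely;
// -1 (the default) sizes it according to available RAM
user_pref("browser.sessionhistory.max_total_viewers", 0);
```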
Slowly but surely, Internet Explorer is catching up to Firefox. In IE7, Microsoft added tabbed browsing and integrated RSS support to its browser. If Firefox is going to continue to compete, it will need to up the ante, but it must do so without making users add extensions .. and possibly introduce compatibility problems.

The emerging technologies appearing on the web today will certainly push the limits of the new breed of browsers, Firefox and Internet Explorer. The change is inevitable, and I do believe that neither will be going back to their roots; both will continue to grow with the web.
Technorati Tags: firefox, internet, explorer, google, bloat, money, code, extensions
Test Blogger .. and get paid for it
From time to time, Blogger will run usability studies to make sure that they are on the right track with all the new features that are being worked on.
If playing with Blogger for an hour or so and making up to $100 sounds like something you’d like to do, sign up here.
You don’t even have to live near Mountain View, CA to participate. Some field surveys are handled over the phone and sometimes they will come to you.
After signing up, there is a five page questionnaire you will need to fill out. It doesn't take too much of your time.
Need more info? Read the FAQ.
Technorati Tags: google, blogger, usability, survey, paid
Yes, I said gimmick and you will also notice that I emphasized the term "paid" in this post title too.
There are quite a few reasons why you should not buy into this scheme. Anyone who does will just be throwing their money away. This is no more beneficial to the subscriber than tits on a boar.
Let's say you "rent" your comment link for a few months. And remember, it is rent. If you don't pay it, you get evicted and NoFollow is taking up residence on your sofa.
Links will not pick up any juice right out of the gate either. Once they "stick" for any duration, then they may give you some benefit. But looking at a page with close to 200 hardcoded links and not a single comment, how much value will that be? And once it does stick, it will be buried in the archives and off of page one.
Comment links are not the same as site-wide links. Site-wide links will overpower your comment link without even lifting a finger. Chow's site has approximately 3,140 pages indexed by Google. By his own admission, a site-wide link costs $240 a month -- do the math, that is only 7.6 cents per link. How many comments will you have to make to bring your margin down?
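The back-of-the-envelope math, using the figures above:

```python
pages_indexed = 3140    # pages of the site in Google's index
sitewide_cost = 240.00  # quoted monthly price for a site-wide link

cost_per_link = sitewide_cost / pages_indexed
print(f"${cost_per_link:.3f} per link")  # $0.076 per link -- 7.6 cents
```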
If you look at this gimmick in the right light, John Chow is not really selling you the NoFollow removal. You are paying him to comment. John is fully aware that you will be commenting (or not) on a daily basis just to get your link in there. This is reverse pay-to-comment mentality.
Let's talk about that plugin a little bit. This would be a first, wouldn't it -- a WP plugin you would actually have to buy? That goes against the grain of WP itself, doesn't it?
The selling of the plugin is far more evil than duping some of John's more ignorant readers into the link renting. And I wouldn't be surprised if some pissed-off blogger hacks the plugin and offers it up on one of the more popular download sites.
But John better revise that plugin to include the microformat rel="paid". After all, they are paid links and Google's Matt Cutts is looking very closely at them. That goes for the site-wide links too.
There is a mechanism in place via the Google webmaster console to report paid links. Albeit, it is not an official one and is being run through their spam reporting system. Currently the main purpose of reporting is so that Google can augment their existing algorithms.
And what does Matt think about paid links?
... link sellers can lose trust, such as their ability to flow PageRank/anchortext. Also, we’re open to semi-automatic approaches to ignore paid links, which could include the best of algorithmic and manual approaches.
I even mentioned earlier this year that paid articles/reviews/posts should be done in a way that doesn’t affect search engines.
As someone working on quality and relevance at Google, my bottom-line concern is clean and relevant search results on Google. As such, I care about paid links that flow PageRank and attempt to game Google’s rankings.
I think I will just end this by agreeing with Matt. 'nuf said.
In the six month experiment, the ad was displayed 259,723 times and clicked on 409 times -- a click-through-rate of 0.16%. The Google ad campaign cost €17 ($23), or succinctly put, €0.04 ($0.06) per click to potentially compromise a machine.
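Checking the figures from the experiment:

```python
impressions = 259_723  # times the ad was displayed
clicks = 409           # times it was clicked on
cost_eur = 17.00       # total campaign spend in euros

print(f"CTR: {clicks / impressions:.2%}")          # CTR: 0.16%
print(f"Per click: EUR {cost_eur / clicks:.2f}")   # Per click: EUR 0.04
```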
Had Stevens been a real-world hacker bent on installing malware on computers through Google AdWords, instead of a security researcher, the results would be pretty alarming.
Equally interesting, however, was the breakdown by browser type when the click-thru rate is compared to market share.
According to Net Applications, Firefox now holds 15.4 percent of the browser market, while Internet Explorer has 78 percent.
Having 80.5% of the click-thrus (335) in the experiment coming from IE users is very comparable to Net Applications' market share estimates.
Firefox represented 12.5% (52 click-thrus). The difference in click-thrus vs. market share for Firefox tells me that, even discounting the large savvy base of geeks, designers and techies who use Firefox, the general public at large is just as oblivious regardless of what browser they use.
I just spotted it in the Technorati WTF not more than a few minutes ago, and it is hitting the Hot List. Preliminary checking shows others are setting up profiles there, and looking at the SERPS a number of blogs are being set up as I speak.
There is also an entry on Wikipedia announcing the contest, which started last April.
* Dates: 22 April 2007 – 30 July 2007
* Keyword: "Ngadutrafik 2007"
* Sponsor: www.masterseo.web.id
* Target: SEO Ngadutrafik 2007 Championship

Ngadutrafik 2007 is the topic of an SEO contest held by Adsense-Id Forum members. It is a non-prized activity that challenges the members and Indonesian SEO professionals and amateurs to rank themselves among the major search engines such as Google, Yahoo, and MSN using certain keyword(s).
One blogger with a hosted WP.com account has been suspended by Matt. Evidently a gal named Nenda ratted him out, citing abuse of the service. The controversy between the two, and another blogger calling attention to it, did not hurt Nenda in the least -- her blog took a steep hike in visitors during that period.
What gets me is, if these people are SEOs, don't they know about Nofollow? Or will Nofollow really matter? They are hitting Technorati, which is nofollow. Some of the blogs they are setting up, they are actually commenting in, and all of those have nofollow links. At any rate, this contest may just show us how effective (or ineffective) nofollow is at combating spam and whether it will curtail it or not. This will be a great opportunity to see them all out in the open like this.
In the meantime, I suggest you keep an eye on your commenting areas, forums too. It may be time to batten down the hatches before the main force blows ashore.
The Text Link Ad Calculator is a slickly packaged Web 2.0 device that will open up many of the secrets behind selling or buying links. The creators of the tool provide a descriptive essay on how this tool actually operates. Pretty unusual for a tool of this magnitude.
In light (or wake) of Matt Cutts' announcement that Google will be taking a closer look at paid links, this is a bold and provocative move from Text Link Ads. I am sure they have done their homework on this subject and they may reap some benefit from it. The intricate work to develop and implement the tool had to weigh heavily on their minds.
Now it is here, lets sit back and see how it plays out.
Looking out the window
I was never privy to the inner workings of selling links. But I had a good grasp of most of the mechanics behind it. There were certain intricacies that I felt were important, but I could never get a clear understanding of what they were.
This calculator reveals what the designers think are the important areas of the page. Most of those areas are generally understood to be important -- by me, and by others around the SEO community. It is not written anywhere -- it is perception.
Taking the perception of link placement and presenting it in an intuitive manner to the masses will have a great impact. The graphical interface with instantaneous reporting only solidifies the proposition. If this is Web 2.0 and the direction it is headed, then they are on the right track.
VIA: Blue Sky Brothers, LLC
Google Analytics Gets Major Overhaul
Here are some of the improvements:
- Email and export reports: Schedule or send ad-hoc personalized report emails and export reports in PDF format.
- Custom Dashboard: No more digging through reports. Put all the information you need on a custom dashboard that you can email to others.
- Trend and Over-time Graph: Compare time periods and select date ranges without losing sight of long term trends.
- Contextual help tips: Context sensitive Help and Conversion tips are available from every report.
Jeff Gillis of the Google Analytics Team says:
Since Google Analytics launched in November 2005, the demand for website analytics has increased significantly. Today there are hundreds of thousands of Google Analytics customers, and web analytics has moved from being a niche function to becoming a mainstream aspect of the business for companies of all sizes. You've asked that we focus our engineering efforts around maintaining the sophistication and features that experienced users want, while also making it easy for both experts and non-experts to quickly and easily find the answers you want.
The one hour broadcast will take place on Webex. Scheduled panelists include Jennifer Su, Sunil Subhedar, Mike Deeringer and Laura Chen. Topics include using channels to track your ad performance, optimizing your ad placement, design, & layout, noticing trends & making proactive improvements to your site, and keeping your account in good standing.
Just Say No to Nofollow
"... a lot of SEO types were posting about nofollow again. The new twist is they’re trying all sorts of plugins and gadgets to selectively pass or bar following links from their blogs for PR.
People, this is getting really old. And really stupid. Just turn the damn thing off already."
I am going to have to agree with Tom on this. It is really stupid.
Also, the whole Wikipedia decision to add Nofollow to outbound links is stupid. And Andy Beal's campaign to Nofollow Wikipedia is stupid, even though it looks like he is following their lead when it comes to trackbacks and comment links. Not quite the pot calling the kettle black, but close to a very dark grey.
Carsten Cumbrowski opines:
The hope is that the return for spamming Wikipedia will be so low that it does not even make any sense for those spammers that don’t need much return to be happy.
You can live perfectly fine in India for $1 a day for example. If Spamming Wikipedia reduces that down to $0.25, the spammer will probably look for other targets. And those other targets will also go away eventually, but that is a complete different story.
Well isn't that special! The keywords here are "other targets" (e.g. You, Me and Dupree). And just how will those other targets eventually go away?
And why is it a completely different story? It is the story. Wikipedia effectively kicked the spammers out of their yard and now they are coming to a neighborhood near you. Maybe it is just me, but how does this combat spam?
It's okay to follow NoFollow. Follow?
When Google first suggested Nofollow back in 2005, it didn't take long for a spec to be drafted up by Technorati. MSN Search and Yahoo! Search supposedly jumped on board quickly, along with scores of blog software developers and proponents.
The spec abstract has nothing to do with not following a link, just applying no weight to the link itself:
By adding rel="nofollow" to a hyperlink, a page indicates that the destination of that hyperlink SHOULD NOT be afforded any additional weight or ranking by user agents which perform link analysis upon web pages (e.g. search engines). Typical use cases include links created by 3rd party commenters on blogs, or links the author wishes to point to, but avoid endorsing.
When the big three of search come out and say they "will respect Nofollow", what does that really mean? After reading dozens of comments to this regard, I am under the impression that spiders can and will follow the links if they so choose ... they just will not apply any weight to the link. A link is still a link, then.
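For reference, the markup itself is nothing more than a rel attribute value on the anchor (example.com used as a placeholder):

```
<!-- ordinary link: may be afforded weight in link analysis -->
<a href="http://example.com/">a followed link</a>

<!-- same link, flagged so engines afford it no additional weight -->
<a href="http://example.com/" rel="nofollow">a nofollow'ed link</a>
```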
Is SPAM the cholera epidemic of the Internet?
Should we burn down an entire village of Wikipedia huts just to eradicate a plague, only to have that plague show up in somebody else's village? Where does it stop? I think that with all the Doctors and Chemists at Google General, they can come up with a better cure than this.
My hat goes off to Slashdot, who use heuristics, karma and other factors in combination with nofollow to combat spam. Also to Blogoscoped for their "fading nofollow" policy, and to other like-minded bloggers who are just saying NO to NoFollow.
This is the kind of responsible, forward thinking that we need to be doing, not to mention that it does the SEs' job for them to boot. It seems to me that any SE should have the ability to differentiate between a blog post and a comment and not apply too much weight to the comment anyway. Why should we have to tell them this?
Meanwhile, Microsoft and Cisco are investigating the possibility of establishing server farms in Iceland powered fully by renewable energy. The power comes from both geothermal and hydroelectric sources and is so cheap that in the wintertime some sidewalks in Reykjavík and Akureyri are heated.
Google is also building a massive server farm near Eemshaven in the Netherlands, where 100,000 servers will have access to 30 megawatts of power, some of which will be delivered by windmills.
And what if you can't find cheap electricity? Then you have the Iowa State Senate pass a bill giving Google a break on the sales tax from utility bills and a property tax break for the site of the center itself.
It is not just the cost of electricity that is an issue here. Larry Page cares about the polluting effect that Google may have. Apparently the Internet is not a very clean industry, and mega-corps like eBay, Amazon, Yahoo, Microsoft and Google are partly responsible for global warming, since their server farms are hardly "carbon neutral". Here we go again ... Google is evil, bad Google.