This morning a friend forwarded me this email from Mockingbird.
Dear Mockingbird customers,
Mockingbird’s taking flight
We’re excited to announce that Mockingbird will launch on August 15, 2010 — with multi-user collaboration!
Details on the paid plans:
There will also be a free account available that allows 1 project with 3 pages.
You can archive finished projects, which won’t count toward your active project limit. You won’t be charged for months during which you have no projects active.
What you need to do
Projects made with beta accounts will be automatically archived and will not be accessible after August 15, 2010. Customers who choose to upgrade to a paid plan may re-activate beta projects, but free accounts will not be able to access old beta projects, so please make sure to log into your Mockingbird account to export PNG or PDF versions of any projects you need.
Your feedback has made Mockingbird better, and we’d like to thank you by offering a coupon for 25% off the first three months of any paid plan. Keep an eye on your inbox for the coupon code when Mockingbird launches.
Questions or comments?
If you’ve got questions about the transition out of beta or want to get in touch with us for any other reason, we’d love to hear from you. Email us at firstname.lastname@example.org.
Thanks for all your support, and we look forward to continuing to help you make great wireframes.
– The Mockingbird Team
In a nutshell, they are saying: thanks for testing, all projects are now archived, so export them. If you want your old projects back, pay us. Wow... Awesome. Thanks!
I jumped on Twitter. I never use Twitter, but I figured it might be worth a try.
Wow. Didn’t expect that.
I guess I just have to tweet a complaint to get my projects extended, so everyone else in the same boat as me has to ask? What’s the harm in letting your free beta testers keep their current projects? Oh… right. Mockingbird doesn’t want casual users who could see the benefits of a quick HTML5 mockup tool, eventually become paid users, and tell friends and colleagues alike how great an application it is.
My project is exported. Who knows what they’ll do in the future, but don’t consider your projects safe with this startup.
Like me, you’ve probably read all sorts of great things about the public DNS service OpenDNS, but one thing you should find out for yourself is how it will impact the speed of your web browsing.
After using OpenDNS name servers for months at work, I started to notice that page load times at home (on a much slower connection) were considerably more “snappy” with my ISP-provided DNS servers. A quick Google search turned up the comprehensive open-source namebench DNS benchmarking tool.
A run of namebench produced a tidy HTML report full of numbers and graphs that confirmed my suspicions: my ISP’s (Comcast) DNS servers were 174.4% faster than OpenDNS on average, with Google Public DNS a close second.
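For readers puzzled by what a figure like “174.4% faster” actually means, here’s a minimal sketch of the metric as I understand it (my assumption, not namebench’s documented formula): percent faster = (slower mean latency / faster mean latency − 1) × 100. The millisecond values below are hypothetical, chosen only because they happen to work out to roughly my result:

```python
def percent_faster(fast_ms: float, slow_ms: float) -> float:
    """How much faster the `fast_ms` resolver is than the `slow_ms` one,
    expressed as a percentage (e.g. 100.0 means twice as fast)."""
    return (slow_ms / fast_ms - 1) * 100

# Hypothetical averages: an ISP resolver at 32.8 ms vs OpenDNS at 90 ms
# works out to about the 174.4% figure from my namebench run.
print(round(percent_faster(32.8, 90.0), 1))  # 174.4
```

The point of the formula is that “174.4% faster” means the slower resolver took almost 2.75 times as long per lookup, not that lookups were 174% of anything in absolute terms.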
It should be noted that namebench is rather smart: it rifles through your browser history to compile a personalized set of domains for its testing, so my results are potentially unique. And this is just one location, with one set of tests (actually I ran three, including one using the Alexa top domains as the source to avoid DNS caching, and the results were all similar). My numbers may also change in the future due to server load, poor caching, and so on. Still, once I switched to my ISP’s DNS servers I experienced a noticeable difference in page load times: with OpenDNS there was always a slight wait before a page would even start to load, and now that wait is gone. And that’s on a 36 Mbps downstream connection.
My point is simple:
Switching to OpenDNS (or any other DNS service) may actually be a bad thing for you. Benchmark it on your own connection before assuming it’s an upgrade.
Lately I’ve been loving Clonezilla for rolling out refurbished Dell workstations. It’s been really cool: boot from a USB “liveCD”, clone disk to disk directly over gigabit Ethernet, reboot, repeat. But after doing 10 of them, I ran into the true limitation of Clonezilla. Clonezilla relies on ntfsclone and partimage (great tools), but they share a key weakness: neither can restore an NTFS drive or partition image to a smaller target; in my case it was a matter of a dozen sectors. It’s ironic, because both tools copy only the used blocks and seem to support resizing, but they just plain don’t do it. Needless to say, I couldn’t accept that fact until I was done pounding my head against the issue thoroughly; then I used the de facto Windows imaging tool: Norton Ghost.
So, it’s 4:00 AM and I’m in the lab finishing up my Ghost disk-to-disk imaging on the remaining machines…
Total time to break remaining boxes and yank HDs + Ghost imaging time = 30 mins.
Time wasted to get to this point = 3 hours.
If anyone can prove me wrong concerning the shortcomings of Clonezilla, please do (and comment, duh).
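One workaround I’ve seen suggested (I haven’t verified it end to end on this hardware, so treat it as a sketch) is to shrink the NTFS filesystem on the source disk with ntfsresize before imaging, so that the restored filesystem is never larger than the smallest target. The device path and target size below are assumptions; the script defaults to a dry run that only prints the commands:

```shell
#!/bin/sh
# Sketch: shrink the source NTFS filesystem below the smallest target disk
# before cloning, so ntfsclone never has to restore into a smaller space.
# SRC and the 70G size are hypothetical; substitute your own values.
SRC=/dev/sda1
DRY_RUN=1   # set to 0 only on a machine you can afford to re-image

run() {
    if [ "$DRY_RUN" -eq 1 ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

# 1. Ask ntfsresize (part of ntfs-3g) how small the filesystem can go.
run ntfsresize --info "$SRC"

# 2. Shrink the filesystem to fit the smallest destination disk.
run ntfsresize --size 70G "$SRC"

# 3. Shrink the partition itself to match (fdisk/parted), then image as
#    usual with Clonezilla/ntfsclone; the image should now fit.
```

Resizing the partition table entry to match the shrunken filesystem is the fiddly part, and a mistake there will eat the disk, which is exactly why I reached for Ghost at 4:00 AM instead.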
After spending 6+ hours on the phone with HP Technical Support (not an exaggeration) attempting to convince them that the issue with our customer’s HP Slimline was in fact hardware-related and not a configuration problem (that’s another story), they finally agreed to replace the motherboard under warranty. When we got the computer back and read the service report, we were less than pleased… Continue reading “HP Service is Scared of a little Bug”