## When The AD FS 2.0 Service Fails To Start (Event 7000 or 7009)

Thursday, 04 July 2013 13:04:15 CEST

There can be many reasons, and the following solution is rarely the right one, but when this error occurs in the event log:

A timeout was reached (30000 milliseconds) while waiting for the AD FS 2.0 Windows Service service to connect.

It can be difficult to troubleshoot, and Google fails to turn up a solution. Usually, the fix is to increase the service start timeout value, as described here:

http://support.microsoft.com/kb/922918

On the internal AD FS 2.0 server, the problem usually occurs when it takes more than 30 seconds to connect to the database server, and it is mostly seen after the AD FS 2.0 server has been restarted.

On an AD FS 2.0 proxy, the problem usually occurs because the service takes more than 30 seconds to connect to the internal AD FS 2.0 server, and it is mostly seen in the last step of the AD FS 2.0 proxy configuration wizard.

The error can occur on both internal servers and proxies.
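
The KB article boils down to a registry change: add a ServicesPipeTimeout value (in milliseconds) under the service control key and reboot. A minimal sketch as a .reg file, assuming a 60-second timeout is enough for your environment:

```
Windows Registry Editor Version 5.00

; Raise the service start timeout from the default 30000 ms (0x7530)
; to 60000 ms (0xea60). A reboot is required for the change to take effect.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control]
"ServicesPipeTimeout"=dword:0000ea60
```

If the service still fails after this, the timeout value isn't the problem; look at what the service is actually waiting for (database or federation server connectivity, as described above).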

## Getting Started With Xcode And OpenCV On Mountain Lion

Monday, 08 October 2012 17:59:45 CEST

Ever tried getting OpenCV running in Mac OS X 10.8 (Mountain Lion)?

Plowing through all the documentation and guides can be a pretty daunting task (well, at least they exist). These simple steps will get you up and running with OpenCV 2.4.2 in Xcode 4.5:

First compile, build, and install OpenCV from sources:

1. Install Homebrew. Open Terminal and run `ruby -e "$(curl -fsSkL raw.github.com/mxcl/homebrew/go)"`
2. Run `brew install svn cmake ffmpeg libjpeg libpng`
3. Get the latest sources for OpenCV (currently OpenCV 2.4.2) here
4. Unpack them somewhere and cd into the folder in Terminal
5. Run `cmake .`
6. Run `make && make install`

The next step is to create an Xcode project that uses OpenCV. This is as simple as creating a new C/C++ project and specifying search paths for the OpenCV headers and libraries:

1. Open Xcode and choose File > New > Project > Command Line Tool
2. For the target, select Build Settings
3. For Header Search Paths, specify /usr/local/include
4. For Library Search Paths, specify /usr/local/lib
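
The search paths only tell Xcode where to look; the project also has to link against the OpenCV libraries themselves. One way is to add linker flags under Build Settings > Other Linker Flags (the library names below are assumptions based on a default `make install` into /usr/local/lib; the test program further down only uses the core and highgui modules):

```
-lopencv_core -lopencv_highgui
```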

In main.cpp, add something like this to test it out:

```cpp
#include <opencv2/opencv.hpp>

int main(int argc, char *argv[])
{
    // Create a 100x200, 8-bit, 3-channel image and show it in a window
    IplImage *img = cvCreateImage(cvSize(100, 200), IPL_DEPTH_8U, 3);
    cvNamedWindow("Hello World!", CV_WINDOW_AUTOSIZE);
    cvShowImage("Hello World!", img);
    cvWaitKey(0);
    cvDestroyWindow("Hello World!");
    cvReleaseImage(&img);

    return 0;
}
```


An alternative method is to install OpenCV with Homebrew via `brew install opencv` (not tested), but its dependencies include, among other things, a Fortran compiler, which I didn't want on my system, so I took the slightly more elaborate approach of installing the dependencies manually and building OpenCV from source.


## Sketchup@Google no more

Thursday, 26 April 2012 16:03:58 CEST

The spring cleaning that has been going on at Google is more thorough than I had anticipated. They are passing Sketchup on. The official announcement is here.

Unlike the other services Google has been shuttering, Sketchup has been a very popular product (though probably not commercially successful), and consequently it is not being closed. Rather, Google is passing the baton to Trimble. In fact, Sketchup was so successful with the average user that e.g. AutoDesk has created its own free 3D modeller.

This move is quite surprising to me, because of the strategic importance Sketchup appeared to have, until now at least. Sketchup has so far been the way to create structured content for Google Earth and thereby Google Maps. With the handoff of Sketchup, Google is relinquishing total control of this content creation value chain, which suggests that they're no longer aiming for a full 3D model of our physical world.

Given the development Google Maps has seen, and the comparative standstill of Google Earth, it seems reasonable that Google Earth will eventually go away once Google Maps reaches feature and performance parity.

## iPad, Gmail, Exchange, and Multiple Calendars

Friday, 06 April 2012 10:30:52 CEST

If you use Google Calendar, have multiple calendars, and want to show them on an iOS device using an "Exchange" connection to GMail (which provides better push support than IMAP), then to use the page http://m.google.com/sync you must (1) be logged in with your Google account in Safari, and (2) view the page in English.

I just got a new iPad, and had to go through the usual hoops to set up mail, contacts, calendars, etc. I use multiple calendars in Google Calendar extensively, and if you connect the iPad to GMail using "Exchange", getting to the "extra" calendars on iOS isn't as straightforward as it could be. Some googling eventually brings up the link to choose which calendars to sync - currently it's http://m.google.com/sync - but for some reason I kept getting the error "Your device isn't supported" when visiting the link on my new iPad.

## Node.js to become a first class citizen on Windows

Friday, 24 June 2011 10:00:50 CEST

This is the best news I've had all week!

Microsoft and Joyent have announced that they will work towards making Windows a supported platform for Node.js.

If you live in a Windows shop, it will no longer be political suicide to suggest a Node solution. Well, at least it will be less so :-)

The event-driven nature of Node.js is simply a much better fit for many problems than the more traditional per-request threading model that many web servers employ. Handling 10,000 long-running requests on IIS is no fun, but it's a breeze in Node.js.

For some nails, Node.js is simply a much better hammer! That it also scales really well is an added bonus.

At the moment, Node.js solutions can really only be deployed on *nix platforms. Node.js does run on Windows, but everything about doing so is a hassle, and its Cygwin cage hampers performance. Unfortunately, most Windows shops are terrified of anything non-conformant, which often means solving problems with 'approved' tools rather than the best tools, effectively ruling out Node solutions deployed on anything non-Windows.

This news brought Node.js a little closer to becoming an approved tool, rather than just being the best :-)

## Lightroom and Image File Location

Monday, 20 June 2011 06:00:14 CEST

If you only use a Lightroom catalog on a single computer, the requirements for storing the image files aren't that complicated, but how do you do it if the same catalog is to be used on many different computers?

I've previously written a long post about using Dropbox to sync a Lightroom catalog between multiple computers so Dropbox may seem like the obvious answer, but for storing actual image files, I don't think it is.

Here's how I do it: I store all originals on a NAS (with RAID protection, multiple backups, etc.) on my home network, and then link to the image files in Lightroom using their UNC network path, e.g. \\host\path\to\image.dng.

The advantage of doing this, in relation to the Lightroom catalog, is that this solution is platform independent. I use Lightroom on both Windows and Mac, and the UNC network path is the same on both platforms.

If I'd referenced the images using a mapped network drive on Windows (e.g. Z:\path\to\image.dng), they wouldn't be accessible in Lightroom on Mac, as a Mac doesn't know what 'Z:' means. It would also mean that the network path must be mapped to the same drive letter all the time, on every Windows computer where Lightroom is to be used, which is also a chore.

There's a slight quirk in Lightroom when using UNC network paths to store image files: Lightroom is case sensitive about the path, so \\host\path\to\image and \\HOST\path\to\image end up as two different locations in Lightroom, although they obviously aren't. This seems more like a bug than a feature.

By using the UNC network path, the original images are available in Lightroom on any platform, so long as the network path is reachable, including over a VPN connection when I'm not at home.

As an aside, it also makes upgrading computers much easier, as you don't have to safely transfer hundreds of gigabytes of image files, and you don't have to re-link tens or hundreds of folders in the catalog because the image file location has changed. Remember when the users' home directory location in Windows changed from 'Documents and Settings' to 'Users' ...

## Lightroom and Dropbox - Here's how to do it

Thursday, 16 June 2011 06:00:36 CEST

I've finally found a simple solution to using the same Lightroom image catalog on multiple computers: Store the Lightroom catalog, settings, plugins, and previews in a folder in Dropbox, and magic happens.

The only caveat is that you must wait for Dropbox to complete synchronization when switching from one computer to another.

I've been using Lightroom for a long time. Here's why.

Accessing the same image catalog on different computers, e.g. a laptop and a desktop, has always been a frustrating experience. I use just a single image catalog, and I need access to it both on my laptop when I'm working with a client and on my desktop when I'm working at home.

Note that I don't need access to the originals; I just need access to the catalog and previews, and I need to be able to update the catalog.

I've tried many different solutions, but they've all been flawed in one of two ways: they've either been too slow, e.g. using rsync to keep multiple copies in sync, or offered no seamless backup protection of the catalog, e.g. storing the catalog on an external portable drive.

This problem obviously applies to most image cataloging software, like Aperture, iPhoto, and iView Media Pro (now Microsoft). The only notable exception has been Google Picasa, which for a long time has offered to sync images between multiple computers using Picasaweb as intermediary.

Here's how to do it:

1. Create a folder within the Dropbox folder. I created a folder 'Lightroom' under the Photos folder in Dropbox, but any folder will work
2. Copy everything from the old Lightroom folder to this new location
3. Let Dropbox complete the synchronization
4. Open the copied Lightroom catalog, and verify that everything works just as before
5. Change the Lightroom preferences so that user presets are stored with the catalog. Lightroom defaults to storing user presets in the user's home directory (%APPDATA% on Windows), but we want them available on all computers where Lightroom is used
6. Optional, but highly recommended: Change the Lightroom preferences to update metadata in the originals
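
After these steps, the Dropbox folder might look something like this (a sketch only - the folder names and the catalog name are illustrative, not prescribed by Lightroom):

```
Dropbox/
    Photos/
        Lightroom/
            Lightroom 3 Catalog.lrcat
            Lightroom 3 Catalog Previews.lrdata
            Lightroom Settings/    (user presets and plugins)
```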

Most people I've found writing about this suggest setting Lightroom to delete full-size previews after a very short time, e.g. one day, to reduce the size of the previews folder. I don't think this works as a general recommendation, because it's mostly a trade-off in terms of speed. Whether or not it's a good idea depends on two things: do you actually need the full-size previews, and what kind of Internet connectivity do you have?

If you need the full-size previews badly enough, then you're probably willing to accept the storage overhead and the delay caused by syncing them, and after the initial sync this isn't going to be that much anyway. On the other hand, if you don't need them, you might as well not store them, to save space and gain a little speed.

I don't need them, so I don't store them for very long, but I do set my standard preview size quite large, which actually adds up to a bigger storage overhead than keeping smaller standard-size previews and storing full-size previews for a longer period of time.

Next up is a post on how to store the actual image files ...

## OC4J, MTOM, and the huge problem of [limited] return types

Friday, 10 June 2011 06:00:31 CEST

This one really surprised me.

Web services implemented on the OC4J stack cannot return types in the java.* namespace, at least up to and including the current version, 10.1.3.5.

This really limits the usefulness of MTOM (Message Transmission Optimization Mechanism) support in OC4J. If you need to deploy web services on OC4J that return large amounts of data, most likely with MTOM transport enabled, your options appear to be either: don't, or use JAX-WS RI.

Try to compile and deploy a webservice with a method signature like:

```java
public InputStream getPublicationTable() throws RemoteException;
```

and the oracle:assemble Ant task will most kindly tell you that:

Return type java.io.InputStream Can not have a value type in a package under java.*

Under most circumstances, changing the method signature to something like

```java
public byte[] getPublicationTable() throws RemoteException;
```

will get you where you want to be, and if the data is already binary, you might as well MTOM-enable the service at the same time.

The above works just fine for exchanging images, PDF files, and similar data, but MTOM enables other use cases than simply avoiding the base64 encoding overhead when transferring binary data over SOAP.

On a current project, I need to expose web services that return data in quantities that are orders of magnitude larger than available server memory - upwards of 100 GB transferred in a single method invocation. The reasons for this can always be debated, but such are the customer's requirements.

In this context the proposed solution doesn't work, because it requires a server capable of holding the whole byte array in memory (it's a store-and-forward pattern). The usual workaround is to introduce some form of data chunking, but that imposes unwanted properties on both the server and the caller, such as state and housekeeping (how far are we, and how far do we have to go?). More importantly, it defeats the purpose of letting the infrastructure handle these complexities in the first place.

If you're stuck on OC4J, somewhere around version 10.1.3.x and earlier than 11g, the only option available seems to be to hook another WS stack into OC4J that does support exposing streams as return types. Luckily, Oracle provides some decent information on doing so, though you will lose all the nice tooling support available in a pure OC4J stack.

If you're lucky enough to be on Metro, this is how you do it. It's also worth reading this question on Stack Overflow.

## Insert document into MongoDB from Node.js

Thursday, 09 June 2011 13:51:31 CEST

This is a very simple example of how to insert a document into MongoDB from Node.js.

I recently needed to use MongoDB with Node.js on a project, and finding a barebones example of how to make them work together was more difficult than anticipated.

It's nothing more than what's in the excellent Getting Started with MongoDB and Node.js presentation, but it's kind of hard to copy-paste from an image :-)

```javascript
var mongo = require('mongodb');

// Connect to the 'mydb' database on a locally running MongoDB instance
var db = new mongo.Db('mydb', new mongo.Server("127.0.0.1", 27017, {}), {});

db.open(function(err, db) {
    db.collection('sample', function(err, collection) {
        var doc = {
            "prop1": "val",
            "prop2": {
                a: 1,
                b: 2
            }
        };
        // Insert the document, and close the connection when done
        collection.insert(doc, function() {
            db.close();
        });
    });
});
```


For this example to work, you need to have MongoDB installed, started, and listening on the default port. You also need to add the mongodb module to your Node installation (`npm install mongodb` worked for me).

Another good simple example of using MongoDB with Node.js can be found here. I don't find the notation quite as clear, but that's just my personal preference. It works just fine.

## Official HTC Hero ROM Update Released In Scandinavia

Wednesday, 16 September 2009 06:00:00 CEST

If you own an HTC Hero and live in Scandinavia, you'll be very pleased to know that HTC has officially released the ROM update that's been written about for some time now.

And the speed improvements this upgrade brings to the UI are unbelievable!

My choice of HTC Hero was originally an informed compromise. I knew that it would be slow(er than the iPhone), but I decided this was outweighed by its ability to fully integrate with both Google and Exchange, as well as its Facebook integration with contacts.

The new upgrade takes the UI speed to the same level as the iPhone 3GS. Even with many widgets active, the UI remains fully responsive, which certainly wasn't the case with the original HTC Hero firmware.

Head over to HTC Europe support, but download the image from HTC USA, as download speeds from HTC Europe and HTC Asia are painfully slow.
