Import Export Fails Once Blog Post Rows Exceed Around 700

Topics: General, Troubleshooting
May 16, 2012 at 6:30 PM
Edited May 18, 2012 at 4:47 PM

I've been using the Import Export module to import content from a WordPress site to Orchard CMS. I have a blog with almost 1700 posts.

I know the recipe XML is fine, because I broke it up into 5 smaller files, any two of which import correctly, leaving almost 700 posts imported. Once the database holds this many posts, any subsequent import fails.

The error log reveals:

Orchard.OrchardCoreException: Recipe execution with id 380005f9efab45f5890b7bcc5821d25b was cancelled because the "Data" step failed to execute. The following exception was thrown: could not execute query

The query in question is apparently:

Exception during Alias refresh NHibernate.ADOException: could not execute query

Of course, if I manually execute the query it lists, it works fine. This looks like some bug involving NHibernate, database connections, and queries getting slower as the dataset gets larger.
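For reference, NHibernate does expose an ADO.NET command timeout setting, and raising it is a common mitigation for "could not execute query" timeouts on large datasets. Whether Orchard's own configuration surfaces this property is an assumption here; in a plain NHibernate configuration file it would look like:

```xml
<!-- Sketch of a hibernate.cfg.xml fragment: raise the ADO.NET command
     timeout to 600 seconds. Whether Orchard lets you set this directly
     is an assumption; it may require a code change in the data module. -->
<hibernate-configuration xmlns="urn:nhibernate-configuration-2.2">
  <session-factory>
    <property name="command_timeout">600</property>
  </session-factory>
</hibernate-configuration>
```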

May 17, 2012 at 7:05 AM

Did you try the BlogML module?

May 18, 2012 at 4:50 PM

I was looking at it, but I've added some extra fields for some SEO meta content. The Import Export module can import that content; I don't see how that would be possible with BlogML. We're talking thousands of posts.

Not importing that meta content or entering it by hand isn't an option.

May 18, 2012 at 5:07 PM

After some tweaking I was able to import all your files, but not the big one itself... as I thought that was your goal, I gave up. Now that I've re-read your message, it seems importing one file at a time is fine for you, isn't it? I also wrote a command to import those files, as I doubt doing this kind of import over HTTP is the right approach. I'll try again this morning and let you know how it goes.

May 18, 2012 at 6:11 PM

I was able to pull in the 5 files from the command line without a problem. The first one took 3 minutes, and the last one 7 minutes. Browsing those ~1700 blog posts is quite fast, showing no noticeable slowdown compared to a blank site.

I'm not sure the changes didn't break anything, so I won't include them in the coming 1.4.2; it's too soon. I did include them in 1.x, though, and I suggest you grab those two changesets and apply them to your local website. You can also create your own fork on CodePlex if you want:

Then close your website server, open a console command, and type something like this:

orchard.exe import file /Filename:"the_location_of_the_xml_file"

Repeat it for every file.
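If there are many recipe files, repeating the command can be scripted. A minimal sketch in Python (the orchard.exe invocation matches the command above; the directory layout and file naming are assumptions):

```python
import subprocess
from pathlib import Path

def build_import_command(recipe_path):
    # Build the argument list for the Orchard CLI command shown above.
    # subprocess passes each element as a separate argument, so no
    # literal quotes around the path are needed here.
    return ["orchard.exe", "import", "file", f"/Filename:{recipe_path}"]

def import_recipes(recipe_dir):
    # Import every recipe file in the directory, one at a time, stopping
    # on the first failure (check=True raises CalledProcessError).
    for recipe in sorted(Path(recipe_dir).glob("*.xml")):
        subprocess.run(build_import_command(str(recipe)), check=True)
```

Run it from the Orchard bin folder (or put orchard.exe on the PATH) so the executable resolves.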


May 18, 2012 at 6:17 PM
Edited May 23, 2012 at 8:08 PM

Yeah, I don't mind importing file by file. I'm happy you got it working though, because not only do I need to import files 3, 4, and 5 to this instance of Orchard, but like I said, this is part of a bigger effort to migrate a bunch of other platforms to Orchard, all of which have thousands of posts/pages.

I'll follow your instructions and let you know how it goes. Thanks again for your attention to this matter, your help has been invaluable.

May 23, 2012 at 8:01 PM
Edited May 23, 2012 at 8:05 PM

To follow up on this issue:

The patches Sebastian posted links to do indeed work, but ONLY against SQL CE. My SQL Server install still failed with NHibernate connection/timeout issues.

So, if you need to import a lot of content and run into a similar issue, try doing the following:

  1. Apply the patches Sebastian posted above.
  2. If you have 1 giant recipe file, break it up into several smaller ones (I made mine about 1.5MB each).
  3. Run/Install Orchard using SQL CE. This is the "Use built-in data storage (SQL Server Compact)" option when doing a new install.
  4. Use the command line procedure specified by Sebastian above to import your files. I verified on the site after each import, then reset IIS before the next one, just to clear out the database connection. I don't know if that was necessary, but it worked.
  5. Once you have a SQL CE database filled with all your data, if you need to move it to a SQL Server database, use this tool: it lets you load a SQL CE database (a *.sdf file) and generate SQL scripts for both schema and data that can be run against a SQL Server database.
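Step 2 above (splitting a giant recipe into smaller ones) can also be automated. A rough sketch in Python, assuming the recipe is an XML document whose `<Data>` element's direct children are the individual content items; the element names here are assumptions about the recipe layout, and splitting is done by item count rather than by file size (tune `items_per_file` to land near the ~1.5MB that worked above):

```python
import copy
import xml.etree.ElementTree as ET

def split_recipe(recipe_xml, items_per_file):
    # Split one large Orchard-style recipe into several smaller recipes,
    # each keeping the original document shell but only a slice of the
    # <Data> children. Layout assumption: <Orchard> root, <Data> step.
    root = ET.fromstring(recipe_xml)
    data = root.find("Data")
    items = list(data)
    chunks = []
    for start in range(0, len(items), items_per_file):
        part_root = copy.deepcopy(root)
        part_data = part_root.find("Data")
        # Empty the copied <Data> step, then put back just this slice.
        for child in list(part_data):
            part_data.remove(child)
        part_data.extend(items[start:start + items_per_file])
        chunks.append(ET.tostring(part_root, encoding="unicode"))
    return chunks
```

Write each returned string to its own file and feed them to the command-line import one at a time.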