C# / TFS API - Weird Parallel.ForEach error: Collection was modified; enumeration may not execute


I know that it is not possible to add or remove items from a collection while iterating over it with a foreach loop, whether via List&lt;T&gt;.ForEach() or Parallel.ForEach(). That is not what I am trying to do.

What I want to do:

I want to iterate over an array of TFS-WorkItems and create a copy of each item.

The code works fine if it is not parallelized.

What's weird about it:

A System.InvalidOperationException with the error message

"Collection was modified; enumeration may not execute"

is thrown when I execute it in parallel.

But not always: sometimes the code executes just fine in parallel. I couldn't figure out a pattern...

The code:

    public void Clone(string area, string sourceIteration, string targetIteration, bool includeSubIterations)
    {
        WorkItemCollection wisToCopy = getWorkItems(area, sourceIteration, includeSubIterations);
        IEnumerable<WorkItem> wiToCopyList = (from WorkItem mItem in wisToCopy select mItem).ToList();
        internalCloning(wiToCopyList, targetIteration);
    }

    private WorkItemCollection getWorkItems(string areaPath, string iterationPath, bool includeSubIterations)
    {
        if (includeSubIterations)
        {
            return _wis.Query(@"SELECT [System.Id] FROM WorkItems WHERE [System.TeamProject] = '" + _prj + @"' AND [System.AreaPath] UNDER '" + areaPath + "' AND [System.IterationPath] UNDER '" + iterationPath + "'");
        }
        return _wis.Query(@"SELECT [System.Id] FROM WorkItems WHERE [System.TeamProject] = '" + _prj + @"' AND [System.AreaPath] UNDER '" + areaPath + "' AND [System.IterationPath] = '" + iterationPath + "'");
    }

    private void internalCloning(IEnumerable<WorkItem> cloneBatch, string targetIteration)
    {
        var po = new ParallelOptions();
        // if I put '1' here everything works as expected
        po.MaxDegreeOfParallelism = 4;
        Parallel.ForEach(cloneBatch, po, wi =>
        {
            WorkItem copied = wi.Copy(wi.Type, WorkItemCopyFlags.CopyFiles);
            copied.IterationPath = targetIteration;

            copied.State = wi.State;
            copied.Save();
        });
    }

As you can see, the code is pretty straightforward. I'm not even accessing the collection inside the loop. I've tried a lot of different things: creating a new List, using classes from the System.Collections.Concurrent namespace, etc. But I just don't see where the list gets modified. (Or is it even my list that throws the exception?)

I hope someone can figure this out, because it would improve the execution time drastically.



Edit:

As I mentioned, I already tried creating a new List before iterating: Parallel.ForEach(cloneBatch.ToList(), po, ... gives the same result.

Additionally, I've updated the code so you guys can see where the IEnumerable comes from.

Edit 2:

If I leave the copied.Save() statement out, I still get the exception.


I tried using .PartialOpen() as soja noted but was unsuccessful. However, using .Open() DID work.

    List<string> projectList = ConfigurationManager.AppSettings["Projects"].Split(',').ToList();

    foreach (string project in projectList)
    {
        Uri uri = new Uri(ConfigurationManager.AppSettings["TFSURI"] + "//" + project);

        TfsTeamProjectCollection tfsTeamProjectCollection = new TfsTeamProjectCollection(uri);
        WorkItemStore workItemStore = tfsTeamProjectCollection.GetService<WorkItemStore>();
        WorkItemCollection workItemCollection = workItemStore.Query("SELECT * FROM WorkItems");

        Parallel.For(0, workItemCollection.Count, (i) =>
        {
            WorkItem.ProcessWorkItem(project, workItemCollection[i]);
        });
    }

Note that WorkItem is a separate class designed to process a single work item.
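Since the .Open() call reportedly resolves the problem, the fix presumably looks something like the following sketch (an assumption based on the TFS client object model, where WorkItem.Open() forces the item's full field data to load up front, so the parallel loop body never triggers a lazy load on the shared collection):

    // Sketch only: fully open each work item before the parallel loop.
    for (int i = 0; i < workItemCollection.Count; i++)
    {
        workItemCollection[i].Open();
    }

    Parallel.For(0, workItemCollection.Count, (i) =>
    {
        WorkItem.ProcessWorkItem(project, workItemCollection[i]);
    });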

Before running the parallel operation, try iterating through cloneBatch and running the PartialOpen() method on each WorkItem. That's it.

I was having this same problem while walking through the revision history, not modifying a thing. I think that accessing the history must modify the WorkItemCollection somehow, which causes the exception. In any case, running PartialOpen() on each WorkItem, even though it takes more time and uses more memory, fixed the issue for me.
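As a concrete sketch of that suggestion (the method name in the TFS client object model is PartialOpen(); Open() loads even more of the item's state, which matches the edit above reporting that Open() worked where PartialOpen() did not):

    // Sketch only: pre-load each WorkItem so the parallel loop does not
    // lazily populate items from the shared collection while enumerating.
    foreach (WorkItem wi in cloneBatch)
    {
        wi.PartialOpen(); // or wi.Open(), which loads the full field data
    }

    Parallel.ForEach(cloneBatch, po, wi =>
    {
        // ... copy logic as before ...
    });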
