Because you haven't specified AsOrdered() (or used OrderBy() to indicate an ordered query), the parallel query is treated as unordered, and you are not guaranteed any particular output order.
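If you do want the original order preserved, opting in explicitly with AsOrdered() is the documented way; a minimal sketch:

```csharp
using System;
using System.Linq;

// Opting in to order preservation; costs some throughput because
// results must be buffered and re-sequenced at the merge step.
var ordered = Enumerable.Range(1, 100)
    .AsParallel()
    .AsOrdered()            // results come back in source order
    .Select(x => x * 2)
    .ToArray();

Console.WriteLine(ordered[0]);  // 2, regardless of which thread processed it
```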
The result, which, unlike Mr. Servy, I reproduce as ordered every time, is in my opinion not due to:

"because the LINQ method has realized that you're calling Select with an identity projection, so it's just removing the projection operation entirely. If you change the test so that the project actually does something, your tests will fail, as expected."
I tested this with Select(x => x).Select(x => x + 1).Select(x => x - 1) and got ordered results again (.NET 6 through .NET 9).
A bit of digging revealed the difference to be using ParallelEnumerable.Range(1, 100) instead of Enumerable.Range(1, 100).AsParallel(). In the latter case you are almost guaranteed to get unordered results with 100 items.
With ParallelEnumerable.Range(1, 100) we get static partitioning (range partitioning, not chunk partitioning) at the start of the parallel operation: the 100 items are divided into 8 contiguous partitions of 12-13 items each (assuming 8 cores), assigned to the partitions in order.
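The arithmetic behind that 12-13 split can be sketched as follows; this is an illustration of contiguous range partitioning, not the actual PLINQ partitioner source:

```csharp
using System;

// Hypothetical sketch: split [1, 100] into 8 contiguous ranges.
// The remainder (100 % 8 = 4) means the first 4 partitions get 13 items,
// the rest get 12.
int count = 100, partitions = 8;
int baseSize = count / partitions;    // 12
int remainder = count % partitions;   // 4
int start = 1;
for (int i = 0; i < partitions; i++) {
    int size = baseSize + (i < remainder ? 1 : 0);
    Console.WriteLine($"Partition {i}: [{start}, {start + size - 1}]");
    start += size;
}
```

Because each partition is a contiguous slice of the source range, concatenating the partitions back in partition order reproduces the original order even without AsOrdered().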
When it comes time to merge, in our specific case the current implementation of DefaultMergeHelper simply takes the partitions and concatenates them back together in their original order. This is again a special "synchronous" case; I haven't investigated all the code paths for the "asynchronous" case suggested by the private members of the type.
Some demo code that illustrates the difference:
var parallelSeq =
    //Enumerable.Range(1, 100).AsParallel() // 1 will likely be the last element printed
    ParallelEnumerable.Range(1, 100)        // 1 will be the first element printed with ToArray below
        .Select(x => {
            if (x == 1) {
                Thread.Sleep(3000); // delay the first item to expose the ordering difference
            }
            Console.WriteLine($"T {Thread.CurrentThread.ManagedThreadId}:{x}");
            return x;
        })
        .Select(x => x - 1)
        .Select(x => x + 1);

var arr = parallelSeq.ToArray();
foreach (var element in arr) {
    Console.WriteLine(element);
}