These days I have been wrestling with design patterns and parallelism, since I claimed that design patterns are not parallel-friendly. I bet a lot of people won't agree with me; give me some time to think it through.
Let me explain the context first; otherwise you may doubt the rationale behind my test. Most parallel algorithms or approaches I have seen achieve parallelism by partitioning the data. How about partitioning the functions (not the data) instead?
Since I am a fan of F#, I really hope the functional way will win. Let's wait and see. :)
The code below compares the two approaches:
- The first version partitions the data and processes it in parallel.
- The second version applies a list of functions to the data in parallel.
static void Main(string[] args)
{
    // Per-iteration timings (in milliseconds) for each approach.
    List<double> a = new List<double>();
    List<double> b = new List<double>();
    Stopwatch sw = new Stopwatch();
    var size = 100000;

    // Run 100 times to average out scheduling variability.
    for (int j = 0; j < 100; j++)
    {
        var l = Enumerable.Range(0, size).ToList();

        // Version 1: partition the data and increment each element in parallel.
        sw.Restart();
        Parallel.For(0, size, i => l[i]++);
        sw.Stop();
        a.Add(sw.ElapsedMilliseconds);

        // Version 2: build one closure per element, then invoke them in parallel.
        // The cast to Action is required for the lambda to compile.
        var f = Enumerable.Range(0, size).Select(i => (Action)(() => l[i]++)).ToList();
        sw.Restart();
        Parallel.ForEach(f, n => n.Invoke());
        sw.Stop();
        b.Add(sw.ElapsedMilliseconds);
    }

    // Print the paired timings side by side.
    a.Zip(b, (x, y) => new { x, y })
     .ToList()
     .ForEach(n => Console.WriteLine("{0}, {1}", n.x, n.y));
}
First of all, why run the loop 100 times? Parallel execution depends on thread scheduling, which varies from run to run, so a single run does not give an accurate measurement.
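Rather than eyeballing 100 printed pairs, the runs can be summarized with an average. This is a minimal sketch of my own (not from the original post), using made-up stand-in numbers where the real benchmark would supply the `Stopwatch` readings:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Summary
{
    static void Main()
    {
        // Stand-in timing data; in the real benchmark these lists are
        // filled by the Stopwatch measurements in the loop above.
        var a = new List<double> { 5, 6, 5, 7 };   // data-partition version
        var b = new List<double> { 9, 11, 10, 10 }; // function-list version

        // Averaging over many runs smooths out scheduling noise.
        Console.WriteLine("data-partition avg: {0:F2} ms", a.Average());
        Console.WriteLine("function-list avg:  {0:F2} ms", b.Average());
    }
}
```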
The results show that the data-partition version wins. :-(
The first version is so simple, yet the second version's function invocation lets me down (sigh). But anyway, I tried my luck.
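One plausible reason for the gap (my own guess, not measured in the original post): the function-list version has to allocate one closure object and one delegate per element before any real work starts, and every `Invoke` is an indirect call, while `Parallel.For` runs the inlined `l[i]++` directly. A sketch of just the allocation cost:

```csharp
using System;
using System.Diagnostics;
using System.Linq;

class Overhead
{
    static void Main()
    {
        const int size = 100000;
        var l = Enumerable.Range(0, size).ToList();

        var sw = Stopwatch.StartNew();
        // Building the function list alone allocates `size` closures
        // plus `size` Action delegates; Parallel.For pays none of this.
        var f = Enumerable.Range(0, size)
                          .Select(i => (Action)(() => l[i]++))
                          .ToList();
        sw.Stop();
        Console.WriteLine("allocating {0} delegates took {1} ms",
                          size, sw.ElapsedMilliseconds);

        // Each element of f is then an indirect call when invoked.
        f[0].Invoke();
        Console.WriteLine("l[0] after one invoke: {0}", l[0]);
    }
}
```

So before the parallel work even begins, the second version has paid a per-element setup cost that the plain indexed loop never incurs.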