Sunday, November 17, 2013

Why F# Type Provider?

F# type providers are a very nice feature. I have been asked several times: why do I need a type provider? Just for the IntelliSense? I believe there should be something else. In my current team, I found people who "hate" types; they prefer to use plain C# objects all the time. Now I realize why types are so important and how types can increase code quality.

I want to use a simple approval process as a sample. To my understanding, a type defines the meaning of an area of memory and the operations which can be executed on it.

The sample is simple. You have something to be approved by a lead, a manager, and a director. If the lead approves, it returns a manager type. If the manager disapproves, the process stops and the director never sees it. After the director's approval, the process ends. A type provider can expose three types: the director's approve method returns nothing, while each other type's approve method returns the next level up. The lead's approve method returns a manager, and the manager's approve method returns a director.

I'd bet people will say a single type can do the same job: a general class can be used to create three instances, and the director instance's approve method can return NULL or some special value to stop the approval process. That works, provided you know beforehand that the director's approval ends the process. If that information is not known beforehand, a runtime error is not far away. The type provider uses types to carry this information: after you invoke the director's approve method, which returns nothing, the user knows immediately that the process has terminated.

I would say that in this case, types are a way to move a runtime check/error to compile time. Attached is the demo project for this approval process.
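To make the idea concrete, here is a minimal hand-written sketch of the three-type approach (type names and messages are mine, not from the attached demo; a real type provider would generate these types instead of writing them by hand):

```fsharp
// Director.Approve returns unit: there is no next stage, so calling Approve
// again after the director is a compile-time error, not a runtime NULL check.
type Director() =
    member this.Approve(item: string) =
        printfn "director approved %s; process ends" item

type Manager() =
    member this.Approve(item: string) =
        printfn "manager approved %s" item
        Director()

type Lead() =
    member this.Approve(item: string) =
        printfn "lead approved %s" item
        Manager()

// the full chain compiles...
Lead().Approve("purchase request").Approve("purchase request").Approve("purchase request")
// ...but appending one more .Approve after the director's would not compile,
// because Director.Approve returns unit.
```

The compiler, not a NULL check at runtime, is what stops the chain.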

Friday, November 8, 2013

Use F# to Write PowerShell Snapin & Cmdlet

When I was in Minneapolis, I was thinking of using F# to write PowerShell snapins and cmdlets. I got the feeling that this could speed up our development.

Because I use Visual Studio 2012, which generates .NET 4 binaries, I need to install PowerShell 3.0 in order to use .NET 4.0. You can use $psversiontable to check the CLR runtime version; make sure it is a number equal to or greater than 4.
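For example, from a PowerShell prompt:

```powershell
# CLRVersion must be 4.x or later for PowerShell to load .NET 4 assemblies
$PSVersionTable.CLRVersion
```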

 namespace Log4NetPsSnapIn  
 open System  
 open System.Management.Automation  
 open System.ComponentModel  
 // installutil.exe only registers the snapin if it is marked with RunInstaller  
 [<RunInstaller(true)>]  
 type Log4NetSnapIn() =  
   inherit PSSnapIn()  
   override this.Name with get() = "aa"  
   override this.Vendor with get() = "bb"  
   override this.Description with get() = "dd"  
 [<Cmdlet(VerbsCommunications.Write, "Hi")>]  
 type WriteHelp() =  
   inherit Cmdlet()  
   override this.ProcessRecord() =  
     // the original post left the body empty; write something so the cmdlet compiles  
     this.WriteObject("Hi from F#")  

After compiling the above code, a DLL is generated. You have to use installutil.exe to register the snapin. The snapin has Name = "aa", so you can use Add-PsSnapIn aa to load the DLL. You can also use installutil.exe /u to uninstall the snapin from your system. Make sure you open the cmd or PowerShell window with administrator privileges.

  • If you are running a 64-bit machine, make sure you use installutil.exe under C:\Windows\Microsoft.NET\Framework64\. 
  • If your OS is 32-bit, use installutil.exe under C:\Windows\Microsoft.NET\Framework\.
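Putting the steps together, a typical session looks like this (assuming the compiled DLL is named Log4NetPsSnapIn.dll; run from an elevated 64-bit prompt):

```powershell
# register the snapin (path is the 64-bit .NET 4 folder; adjust for 32-bit)
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\installutil.exe .\Log4NetPsSnapIn.dll

# load it by the Name declared in the snapin class, then run the cmdlet
Add-PsSnapIn aa
Write-Hi

# uninstall when done
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\installutil.exe /u .\Log4NetPsSnapIn.dll
```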

I like the way F# writes PowerShell snapins. The code is concise and easy to understand.

Monday, September 30, 2013

Web Service using F#

This is the code to serve a file from the local disk over HTTP; make sure it runs in a console application.

 open System  
 open System.IO  
 open System.ServiceModel  
 open System.ServiceModel.Web  

 let siteRoot = @"c:\myCode\"  

 [<ServiceContract>]  
 type MyContract() =  
   [<OperationContract>]  
   [<WebGet(UriTemplate = "{file}")>]  
   member this.Get(file:string) : Stream =  
     printfn "Requested : '%s'" file  
     let path = Path.Combine(siteRoot, file)  
     if File.Exists(path) then  
       WebOperationContext.Current.OutgoingResponse.ContentType <- "text/html"  
       let bytes = File.ReadAllBytes(path)  
       upcast new MemoryStream(bytes)  
     else  
       printfn "cannot find %s" path  
       upcast new MemoryStream()  

 let startAt address =  
   let host = new WebServiceHost(typeof<MyContract>, new Uri(address))  
   host.AddServiceEndpoint(typeof<MyContract>, new WebHttpBinding(), "") |> ignore  
   host.Open()  
   host  

 let server = startAt "http://localhost:8081/"  
 printfn "started service..."  
 ignore <| System.Console.Read()  
 printfn "closing service..."  
 server.Close()  
 printfn "closed"  
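Once the service is running, you can sanity-check it from another F# script or F# Interactive session (the file name index.html is my assumption; use any file that exists under c:\myCode\):

```fsharp
open System.Net

// fetch a file through the service started above
let client = new WebClient()
let html = client.DownloadString("http://localhost:8081/index.html")
printfn "%d bytes received" html.Length
```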

Tuesday, May 28, 2013

SelfNote: ListBox scroll bar size

Override the system scroll bar metrics in the control's resources:

 <sys:Double x:Key="{x:Static SystemParameters.VerticalScrollBarWidthKey}">5</sys:Double>  
 <sys:Double x:Key="{x:Static SystemParameters.VerticalScrollBarButtonHeightKey}">5</sys:Double>  
 <sys:Double x:Key="{x:Static SystemParameters.HorizontalScrollBarHeightKey}">5</sys:Double>  
 <sys:Double x:Key="{x:Static SystemParameters.HorizontalScrollBarButtonWidthKey}">3</sys:Double>  

Sunday, March 3, 2013

SelfNote: PowerShell on SQL Jobs

You know what, the lack of a "Start" button on Win8 is making me really learn PowerShell. I feel more like a Unix admin than an average user. :-D

 # disable backup jobs on a server  
 function Disable-BackupJob($serverName)  
 {  
   invoke-command -computerName $serverName -ScriptBlock {  
     param($serverName);  
     [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null;  
     $srv = New-Object Microsoft.SqlServer.Management.SMO.Server($serverName);  
     $jobs = $srv.JobServer.Jobs | Where-Object { $_.IsEnabled -eq $TRUE } | Where-Object { $_.Name -like '*backup*' };  
     foreach ($job in $jobs)  
     {  
       write-host "$serverName.$job is set to FALSE";  
       $job.IsEnabled = $false;  
       $job.Alter();  
     };  
     # re-query to verify every backup job is now disabled  
     $jobs = $srv.JobServer.Jobs | Where-Object { $_.IsEnabled -eq $TRUE } | Where-Object { $_.Name -like '*backup*' };  
     if ($jobs.Count -eq 0) { write-host "$serverName Done." } else { write-host "$serverName failed!" };  
   } -ArgumentList $serverName  
 }  

Saturday, March 2, 2013

SelfNote: PowerShell scripts

The following three functions use PowerShell to set a service account password, move a cluster group, and test the connection to a database server.

 # set service account username and password  
 function Set-Password($computerName, $serviceName, $serviceAccount, $password)  
 {  
   invoke-command -computerName $computerName -ScriptBlock {  
     param($computerName, $serviceName, $serviceAccount, $password);  
     write-host "on computer " $env:ComputerName "working on " $serviceName;  
     $filter = "Name='" + $serviceName + "' ";  
     $sqlservice = Get-WMIObject win32_service -filter $filter;  
     $result = $sqlservice.change($null,$null,$null,$null,$null,$null,$serviceAccount,$password,$null,$null,$null);  
     if ($result.ReturnValue -eq 0) { write-host $computerName " done!"; } else { write-host $computerName " failed!"; }  
   } -ArgumentList $computerName,$serviceName,$serviceAccount,$password  
 }  
 # move cluster group  
 function Move-Cluster($computerName)  
 {  
   invoke-command -computerName $computerName -ScriptBlock {  
     import-module failoverclusters;  
     $result = Move-ClusterGroup sqlgroup;  
     write-host "owner node = " $result.OwnerNode;  
     $result = Move-ClusterGroup sqlgroup;  
     write-host "owner node = " $result.OwnerNode;  
   }  
 }  
 # test connection  
 function Test-Connection($servername)  
 {  
   $SqlConnection = New-Object System.Data.SqlClient.SqlConnection;  
   $SqlConnection.ConnectionString = "Server=$servername;Database=master;Integrated Security=True";  
   try  
   {  
     $SqlConnection.Open();  
     write-host "$servername connection OK."  
   }  
   catch  
   {  
     write-host "$servername connection failed"  
     write-host $error[0]  
   }  
   finally  
   {  
     $SqlConnection.Close();  
   }  
 }  

Sunday, February 3, 2013

Hadoop Day 3

I got my first map/reduce job running on HDInsight Services for Windows. The material provided by the HDInsight team is very good. :-)

Hadoop gives us a new way to store and process data. We generate a huge amount of data every day, and the data schema changes constantly. Every time we design a database, we assume the data will not change that often, but this is not true. A possible solution is to dump all the data into low-cost storage. If a portion of the data suddenly becomes "valuable", a HIVE external table can be created to give a schema to that data. By using the ODBC driver, the data can then be copied to SQL Server, which is a high-cost storage solution. Hadoop will not replace the traditional SQL Server; SQL Server only serves the "valuable" data, while the "useless" data stays in Hadoop.

I can't publicly discuss or blog about the HDInsight features right now, so this post is a summary of the public materials.

Personally I am more interested in finding a good solution to store the data and later process it as fast as possible. I know Hadoop is ready for "big data", so I am more interested in:

1. How to link with other technologies, such as SQL Server and SQL Azure
2. How to move data between different data stores
3. How to perform my own map/reduce work

The article about SSIS caught my eye; it covers questions 1 and 2. The ASV protocol mentioned in the article is the way to access Azure blobs from Hadoop. If you have read my previous post about Azure blobs, you can tell where that post came from. HIVE can point to an ASV folder by using the Create External Table statement (in section 8.3). You might want to use this link to check all the HIVE data types.
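As a sketch, an external table over an ASV folder looks like this in HiveQL (the table name, columns, and delimiter are hypothetical; asv://container/path maps to a folder in the Azure blob container):

```sql
-- schema-on-read: the data already sits in the blob container,
-- this statement only attaches a schema to it
CREATE EXTERNAL TABLE weblog (log_time STRING, url STRING, status INT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION 'asv://logdata1/Data2/';
```

Dropping an external table removes only the schema; the underlying blobs stay put.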

Once the data is organized as a table, it can be accessed by using the HIVE ODBC driver. ODBC enables all kinds of connections, so our existing tools and skills still apply. Note that you can NOT write data by using ODBC.

The map/reduce program is very simple; it is basically a clone of the C# sample. The only problem I found is debugging. The execution log from the UI is not that helpful. My trick is to write debug information to the output file; once the algorithm is correct, the debug information can be removed.

Saturday, February 2, 2013

Hadoop Big Data - Day2

Hadoop does open a door to many possibilities. This post stores a local file to Azure blob storage. I will use an Azure blob to host the data; maybe I should call it a "garbage can", as it can host any data. :-D

Anyway, the following is the code to connect to a blob container and create a (virtual) folder in it. The code uploads a text file to the Data2 folder in the logdata1 container.

 // requires the Azure storage client library  
 class Program  
 {  
     static void Main(string[] args)  
     {  
         var storageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=<account name>;AccountKey=<account key>");  
         var client = storageAccount.CreateCloudBlobClient();  
         var container = client.GetContainerReference("logdata1");  
         var fn = "Data2/TextFile1.txt";  
         var blob = container.GetBlockBlobReference(fn);  
         // upload file to container  
         using (var fileStream = System.IO.File.OpenRead("TextFile1.txt"))  
             blob.UploadFromStream(fileStream);  
         // list items in the container  
         var blobs = container.ListBlobs();  
         foreach (var b in blobs)  
             Console.WriteLine(b.Uri);  
     }  
 }  

Monday, January 28, 2013

Decide to use TypeScript on HTML5 + JavaScript

I have decided to give my big data project a new face: an HTML5 face, not WPF or Silverlight.

Continuing from the previous post about HTML5/JavaScript, I decided to spend several hours on a new tool for writing JavaScript. I compared Dart and TypeScript. As a long-time C# developer, I feel it will be easier to follow another of Anders's languages. One advantage of TypeScript is that it can host plain JavaScript code, which will help me learn JavaScript as well.

After I installed TypeScript in VS2012, I had to reboot my computer. That was a little surprising, as I did not expect a preview language to need a reboot when VS SP1 does not require one. But anyway, it did not take long.

I created my first TypeScript project, and the sample code helped me understand what is going on. The first small project I did is a small animation on a canvas. The following code is located in app1.ts, which is compiled to app1.js. In the HTML page, I have a canvas whose id is "canvas0".

 var c = <HTMLCanvasElement> document.getElementById("canvas0");  
 var ctx = c.getContext("2d");  
 setInterval(() => {  
   var i = new Date().getSeconds() % 10 / 10;  
   var width = c.width;  
   var height = c.height;  
   // Create gradient  
   var grd = ctx.createLinearGradient(0, 0, width, 0);  
   grd.addColorStop(0, "red");  
   grd.addColorStop(i, "yellow");  
   // Fill with gradient  
   ctx.fillStyle = grd;  
   ctx.fillRect(0, 0, width, height);  
 }, 500);  

To make debugging work, you have to use IE as the default browser. Adding a new item does not work well either; I had to manually add a .ts file and fix the project file.

Sunday, January 27, 2013

HTML5 First Day

My first HTML5 page. I believe big data needs a way to be rendered, and I have decided that HTML5 is the way to go. The sample shows how to embed a script and call a JavaScript function.

 <!DOCTYPE html>  
 <html lang="en">  
 <head>  
   <meta charset="utf-8" />  
   <title>Hello HTML5</title>  
 </head>  
 <body>  
   <button title="Click Me" onclick="javascript:alert('clicked')">Click Me!</button>  
   <button title="Click Me2" onclick="click2()">Click Me2</button>  
   <script type="text/javascript">  
     function click2() {  
       alert('Clicked Me2');  
     }  
   </script>  
 </body>  
 </html>  

Saturday, January 26, 2013

Generated Type Provider and Code Generation

Let me finish with the generated type provider and move 100% to the Hadoop and big data area. In previous posts (1 and 2), I explained how to write a generated type provider. The following are some pain points you might encounter when writing your own.

I currently use a generated type provider to generate a DLL; this is my main task. There is not much documentation about generated type providers available, especially on how to use F# expressions to generate the binary. C# provides TypeBuilder to create types in a dynamic DLL; I use an F# type provider to generate the DLL.

  • Define a field and a property backed by it.
 let providedField = ProvidedField("field", typeof<System.Func<string, string, string, obj, obj>>)  
 let providedProperty = ProvidedProperty("Property", typeof<System.Func<string, string, string, obj, obj>>,  
                          GetterCode = (fun [this] -> Expr.FieldGet(this, providedField)),  
                          SetterCode = (fun [this; v] -> Expr.FieldSet(this, providedField, v)))  
  • ProvidedField is a FieldInfo and ProvidedMethod is a MethodInfo, so anywhere you can pass in a standard .NET type, you can use the Provided types.
  • The constructor code needs to be modified to generate your code; this piece needs to be added to the .fs file. For a generated type provider, you do not have to return <@@ baseClass() @@> in your constructor code.
         let expr = pcinfo.GetInvokeCodeInternal true parameters  
         let locals = Dictionary<Quotations.Var, LocalBuilder>()  
         let expectedState = ExpectedStackState.Empty  
         emitExpr (ilg, locals, parameterVars) expectedState expr  

  • NewDelegate has not been implemented yet. The workaround is to create a module with your function and pass the function in as an expression.
  • Specify the generated DLL path, where cfg is the TypeProviderConfig. You can use any reflector tool to inspect the generated DLL.
 let providedAssembly = new ProvidedAssembly(System.IO.Path.Combine(cfg.ResolutionFolder, "GeneratedBinary.dll"))  

  • Get a C# lambda from an F# function. 
Very simple: you can just write System.Func<int, int>(fun i -> i + 1). 

Tuesday, January 22, 2013

SelfNote: Convert rendered location to actual value

Learn something new every day. :-)

Today I got a solution for getting the rendered location of an object which has been RenderTransform-ed.

The trick is to use MatrixConverter to perform the conversion. 
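I did not paste the original code, but a related sketch (my own assumption, working with the Matrix directly rather than the MatrixConverter) is to invert the element's render transform matrix and use it to map a rendered point back to its pre-transform value:

```fsharp
// hedged sketch, not the original solution: invert the render transform's
// matrix to convert a rendered point back to "actual" coordinates
open System.Windows
open System.Windows.Media

let toActualPoint (element: UIElement) (rendered: Point) =
    let mutable m = element.RenderTransform.Value  // current transform as a Matrix
    m.Invert()                                     // throws if the matrix is not invertible
    m.Transform(rendered)
```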

Sunday, January 20, 2013

Fakes Framework Replace Function Implementation

Visual Studio 2012 introduces the Fakes Framework. I currently need to find a testing framework, and Fakes is one of my choices. I could not find any good tutorial; hopefully this one can give you a quick start.

First, the C# code under test is:

 public interface IStock  
 {  
     int F();  
     int F1(int i);  
 }  

 public class Stock : IStock  
 {  
     public int F() { return 1; }  
     public int F1(int i) { return i + 1; }  
 }  

After creating the Test Project in Visual Studio, you can replace the F implementation with a new function inside a ShimsContext:

 using (var context = ShimsContext.Create())  
 {  
     System.Fakes.ShimDateTime.NowGet = () => new DateTime();  
     var dt = DateTime.Now;  

     var shim = new ClassLibrary1.Fakes.ShimStock();  
     shim.F = () => 7;  
     var x = shim.Instance.F(); // x is 7 instead of 1  
 }  

Wednesday, January 16, 2013

SelfNote: Data Transform in SSIS

I created the table in SQL Server and SQL Azure using exactly the same script. When I tried to use a data source and an OLE DB destination in the SSIS package, SSIS complained that it "cannot convert between unicode and non-unicode". I do not really have time to figure out why this happens; instead, I used a Data Conversion transformation to solve the problem.

Tuesday, January 15, 2013

Replication health and error

The script below checks for replication errors.

 use distribution;  

 select case  
          when COUNT(*) > 0 then 'there are errors for replication during the last half hour'  
          else 'no error found'  
        end  
 from msrepl_errors  
 where [time] > DATEADD(MINUTE, -30, GETDATE())  

 declare @errorCount int  
 select @errorCount = COUNT(*) from MSrepl_errors  
 if @errorCount <> 0  
    select top 2000 * from MSrepl_errors order by [time] desc  
 else  
    select 'NO ERROR'  

Monday, January 14, 2013

Generated Type Provider Sample Published on Codeplex

Steffen asked me to provide a generated type provider sample to help people understand the type provider API changes and generated type providers themselves. Changeset 19125 was committed minutes ago to the F# 3.0 Sample Pack. The sample shows how to write a generated type provider.

In the solution, there are three projects:

  1. GeneratedTypeProvider is the project generating the type provider using the new API. The generated type provides only one function, which returns the constant value 1.
  2. ConsoleApplication2, an F# project, references the GeneratedTypeProvider project to expose the generated type from the type provider.
  3. ConsoleApplication1 references the ConsoleApplication2 project to consume the type from C# code.

The way I use a generated type provider is to generate a DLL. I will write several posts to help interested users explore this area more, once my sinus infection gets better.

Friday, January 11, 2013

Generated Type Provider API Changes

Erased type providers are not a new idea for many F# users, but the problem I am facing is to generate a real DLL which can be consumed by a C# project; I use the type provider as a code generation tool. If you are thinking about the same thing, here is how to turn your erased type provider into a generated type provider. The MSDN documentation does not reflect the recent changes in the type provider APIs.

  • You need to declare a ProvidedAssembly.

let providedAssembly = new ProvidedAssembly(System.IO.Path.ChangeExtension(System.IO.Path.GetTempFileName(), ".dll"))

  • For the generated type, for example regexTy, invoke the following statements:
        regexTy.IsErased <- false
        regexTy.SuppressRelocation <- false

Hopefully this can save you 10 minutes. :-)