Saturday, February 2, 2013

Hadoop Big Data - Day2

Hadoop opens the door to so many possibilities. This post shows how to store a local file in Azure Blob storage; I will use an Azure blob to host the data. Maybe I should call it a "garbage can", since it can hold any kind of data. :-D

Anyway, the following code creates a container, creates a folder inside it, and uploads a text file to the Data2 folder of the logdata1 container in Azure Blob storage. (Blob storage does not have real folders; the "Data2/" prefix in the blob name acts as a virtual folder.)

 using System;
 using Microsoft.WindowsAzure.Storage;
 using Microsoft.WindowsAzure.Storage.Blob;

 class Program
 {
   static void Main(string[] args)
   {
     // Connect to the storage account (fill in your own account name and key)
     CloudStorageAccount storageAccount = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=<account name>;AccountKey=<account key>");
     var client = storageAccount.CreateCloudBlobClient();

     // Get the logdata1 container, creating it if it does not exist yet
     var container = client.GetContainerReference("logdata1");
     container.CreateIfNotExists();

     // The "Data2/" prefix in the blob name acts as the virtual folder
     var fn = "Data2/TextFile1.txt";
     var blob = container.GetBlockBlobReference(fn);

     // Upload the local file to the container
     using (var fileStream = System.IO.File.OpenRead("TextFile1.txt"))
     {
       blob.UploadFromStream(fileStream);
     }

     // List the top-level items in the container (the Data2 folder shows up as a directory entry)
     var blobs = container.ListBlobs();
     foreach (var b in blobs)
     {
       Console.WriteLine(b.Uri);
     }
   }
 }
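
To double-check the upload, you can list just the Data2 folder and read the file back. The snippet below is a minimal sketch that continues the Main method above, reusing the same container variable and the Data2/TextFile1.txt blob name:

     // Verification sketch (continues inside Main above, reusing the container variable)
     // List only the blobs under the Data2/ prefix; a flat listing returns blobs instead of directory entries
     foreach (var item in container.ListBlobs("Data2/", true))
     {
       Console.WriteLine(item.Uri);
     }

     // Read the uploaded text file back from the blob
     var uploaded = container.GetBlockBlobReference("Data2/TextFile1.txt");
     Console.WriteLine(uploaded.DownloadText());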
