Ubuntu with Visual Studio Code ARM Template


Want to set up the new cross-platform ASP.NET Core with Visual Studio Code on a Linux machine quickly? Doing all the required installation bits by hand is tedious, not to mention figuring out all the little issues along the way.


Using our new ARM template you can set up such a box on Microsoft Azure with a single click on the Deploy button (if you already have an Azure account; if not, get one here)!

Then fill in the parameter values:

  • Credentials
  • DNS-Name
  • Run full Ubuntu-Desktop? (installation will take much longer, but you can play Mahjong)
  • Resource-Group Name

and click “Create”.

The ARM Template installs:

  • Docker (from Docker Extension)
  • Ubuntu Desktop with XRDP and xfce4 (Full or Minimal)
  • Visual Studio Code
  • .NET Core SDK
  • NodeJS and NPM v6
  • Yeoman with ASP.NET Generator
  • C# extension for Visual Studio Code

Later, use Remote Desktop Connection to connect to your machine. Computer: <DNS-Name>.<Location of Resource Group>.cloudapp.azure.com. Enter your credentials in the xrdp login dialog. Make sure “sesman-Xvnc” is selected!


You will find Visual Studio Code under Development, or you can start it from the shell with “code .”. You can also use Yeoman with the preinstalled ASP.NET generator.

Read more about the ASP.NET generator on Scott Hanselman’s blog.

Enjoy playing with .NET Core and Visual Studio Code running in Microsoft Azure!

AndiP

API Management on Global Azure Bootcamp 2016

I recently had the opportunity to give an introduction to API Management at the Global Azure Bootcamp 2016 in Linz. You can find impressions of that event here (German only). I decided to publish my slides about API Management, and also some information about the demo environment I used.

OK, this turned out to be more of a blog post about how to authenticate Web Apps against Web API Apps.

First and foremost: to play around with Azure API Management you need a Microsoft Azure subscription, which you can get here.

My demo environment looked like this:

  • 1 developer instance of API Management, managed through the classic Azure portal
  • 1 Azure resource group with a free App Service plan, managed through the new Azure portal, running
  • 3 Azure API Apps (CalcGAB2016, CalcEnterprise, CalcEnterpriseClient)
  • 1 Azure Active Directory instance, managed through the classic Azure portal

If you plan to create API Apps yourself, I recommend using the “Azure API App” template for ASP.NET applications. It comes preconfigured with the Swashbuckle packages, which generate an OpenAPI Specification (formerly known as Swagger) document straight from your code. You can read more here about how to customize the Swashbuckle-generated API definitions.
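For reference, that preconfiguration boils down to a SwaggerConfig class along these lines. This is a minimal sketch from memory, not the template verbatim; the title “CalcGAB2016” is just one of this demo’s API names:

using System.Web.Http;
using Swashbuckle.Application;

public static class SwaggerConfig
{
    public static void Register()
    {
        GlobalConfiguration.Configuration
            .EnableSwagger(c =>
            {
                // Version and title end up in the generated OpenAPI (Swagger)
                // document, served at /swagger/docs/v1
                c.SingleApiVersion("v1", "CalcGAB2016");
            })
            .EnableSwaggerUi(); // interactive UI at /swagger/ui/index
    }
}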

Now to my sample code. There is plenty of documentation on how to use API Management (you can find an incomplete but helpful list in the last slide of my presentation), so I won’t repeat it here. My JWT token demo is based on the presentation by Darren Miller (see time 7:30).

I will therefore focus on some specifics of the AAD integration of the API app “CalcEnterprise” and the web app “CalcEnterpriseClient”, which I have secured with AAD.

Securing Azure Web/API Apps

I love the idea that you can move authentication out of your application and simply configure it in the portal. As a former colleague of mine put it: you do not want a web developer to write your authentication code. Instead you write the application with “No Authentication” selected and configure access in the new management portal:

image

Depending on the authentication you selected, your ClaimsPrincipal.Current object will hold all claims provided by the authority that authenticated your visitors. In addition, you receive the complete token and some other information about the authentication in the headers that Azure provides (a small reading sketch follows the list):

X-MS-CLIENT-PRINCIPAL-NAME = e.g. the email address
X-MS-CLIENT-PRINCIPAL-ID = e.g. a GUID, as in AAD
X-MS-CLIENT-PRINCIPAL-IDP = the identity provider (AAD => “aad”)
X-MS-TOKEN-AAD-ID-TOKEN = with AAD, the JWT token with additional claims, which you can also find in ClaimsPrincipal.Current.Claims if you happen to run the application in ASP.NET
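A minimal sketch of reading these values inside an ASP.NET Web API controller; the controller itself is made up for illustration, only the header names come from the list above:

using System.Collections.Generic;
using System.Linq;
using System.Security.Claims;
using System.Web.Http;

public class WhoAmIController : ApiController
{
    public IHttpActionResult Get()
    {
        // Claims populated by the App Service authentication layer
        var principal = ClaimsPrincipal.Current;

        // Raw headers that Azure adds to the incoming request
        IEnumerable<string> values;
        string name = Request.Headers.TryGetValues("X-MS-CLIENT-PRINCIPAL-NAME", out values)
            ? values.First() : null;
        string idToken = Request.Headers.TryGetValues("X-MS-TOKEN-AAD-ID-TOKEN", out values)
            ? values.First() : null;

        return Ok(new
        {
            Name = name,
            HasIdToken = idToken != null,
            Claims = principal.Claims.Select(c => new { c.Type, c.Value })
        });
    }
}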

Step 1 – Configure your AAD & secure your Web App/API

After you have created a new AAD instance, like <yourChosenName>.onmicrosoft.com, you can define the AAD application that your application will use. Under the Applications tab:

  • Add a new application via “Add an application that my organization is developing”
  • Name it and select “Web Application and/or Web API”
  • Provide the sign-in URL, which is the URL of your website, like https://<yourapp>.azurewebsites.net
  • Provide a unique App ID URI, which can be any unique URI; for multi-tenant applications use the base tenant URI in combination, like so: https://<yourTenantName>.onmicrosoft.com/<your unique name of the app>

After you have created the application, you will find that the REPLY URL has also been populated. This is important, as the identity provider will send the JWT token only to this URL! To configure Authentication/Authorization for your Web App/API:

  • Copy the Client ID of your AAD application
  • Open “View Endpoints” (button at the bottom of the screen) and copy the URL of the Federation Metadata Document
  • Open the Federation Metadata Document URL in a browser
  • In the XML, find the “EntityDescriptor” element and copy the content of its “entityID” attribute, which contains the link to the issuer (Issuer URL)

You will need these two values to configure Authentication/Authorization in your Web App/API like so:

image
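If you ever need to do the same from code instead of the portal blade, the OWIN bearer middleware takes the same two pieces of information. A rough sketch, assuming the Microsoft.Owin.Security.ActiveDirectory package; the angle-bracket values are the placeholders collected above:

using System.IdentityModel.Tokens;
using Microsoft.Owin.Security.ActiveDirectory;
using Owin;

public partial class Startup
{
    public void ConfigureAuth(IAppBuilder app)
    {
        app.UseWindowsAzureActiveDirectoryBearerAuthentication(
            new WindowsAzureActiveDirectoryBearerAuthenticationOptions
            {
                // Tenant whose federation metadata (and thus issuer) is used
                Tenant = "<yourTenantName>.onmicrosoft.com",
                TokenValidationParameters = new TokenValidationParameters
                {
                    // Must match the Client ID / App ID URI of your AAD application
                    ValidAudience = "<Client ID of your AAD Application>"
                }
            });
    }
}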

Step 2 – Applying this concept to my sample code

I figured out that I could cover at least two different scenarios with my two Web Apps/APIs:

  • Assign client and API to a single AAD application
  • Assign client and API to separate AAD applications

With the first option I can easily authenticate my call into the API from my client with the same identity that authenticated on the client (implemented in the HomeController action “Index_SameAAD”):

image

With the second option I can use my client app as a service principal to authenticate against my API, which hides the original identity from the API (implemented in the HomeController action “Index”; a minimal ADAL sketch follows the screenshot):

image
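A minimal sketch of that service-principal (client credentials) call using ADAL (Microsoft.IdentityModel.Clients.ActiveDirectory). The angle-bracket placeholders are the same ones listed in Step 3; the API route at the end is made up:

using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

public static class CalcApiClient
{
    public static async Task<string> CallApiAsServicePrincipalAsync()
    {
        var authContext = new AuthenticationContext(
            "https://login.microsoftonline.com/<yourTenantName>.onmicrosoft.com");

        // The client app authenticates with its own credentials (service
        // principal), so the API only ever sees the client's identity.
        var credential = new ClientCredential(
            "<CalcEnterpriseClient AAD CLientID>",
            "<CalcEnterpriseClient AAD App Secret/Key>");

        // Resource = Client ID / App ID URI of the CalcEnterprise AAD application
        AuthenticationResult result = await authContext.AcquireTokenAsync(
            "<CalcEnterpriseAPI AAD CLientID>", credential);

        using (var http = new HttpClient())
        {
            http.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", result.AccessToken);
            // Hypothetical route -- use whatever action your API exposes
            return await http.GetStringAsync(
                "<yourcalcEnterpriseAPIwebsitesUrl>/api/calc/add?a=1&b=2");
        }
    }
}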

But I can also re-authenticate the identity on the API side to see the original identity. I found this excellent article by Vittorio Bertocci on using ADAL’s AcquireTokenByAuthorizationCode to call a Web API from a Web App, which showed me how to implement this (implemented in the HomeController action “Index_OtherAAD”):

image

Step 3 – Clone the source

Feel free to clone my source code from my GitHub repository and play with it.

You need to replace the following placeholders with actual values, and of course deploy and configure your apps in Azure.

  • ”<your API MGMT API key>”
  • <yourAPIMInstanceName>
  • <yourcalcEnterpriseAPIwebsitesUrl>
  • <yourAPIMInstanceName>
  • <CalcEnterpriseClient AAD CLientID>
  • <CalcEnterpriseClient AAD App Secret/Key>
  • <CalcEnterpriseAPI AAD CLientID>
  • <yourTenantName>

First, restore all packages. For some reason I had issues in the Calc project with the correct DLLs for System.Web.Http and others not loading (funny enough, Visual Studio 2015 shows errors but the solution still compiles fine). Closing the solution and opening the project file instead fixes this.

Clone the Source
Download Slides

Enjoy a nice day – AndiP

Creating a JWT-Token in Windows 8.1 Phone App

I thought I would quickly download the JWT NuGet package for my Windows 8.1 universal app. Well, I was wrong. After some searching I found the article Creating a JWT token to access Windows Azure Mobile Services, but System.Security.Cryptography is no longer available in Windows Phone 8.1 universal apps. You should use the classes in Windows.Security.Cryptography instead, which are of course inherently different.

So I rewrote the JsonWebToken class to work in my universal app, and I am sharing it here in case you run into the same issue. I validated it with the JWT debugger at http://jwt.io/.

BTW, before you ask “Why do you not use a Windows 10 universal app?”: I would, if the Windows 10 preview on my Windows Phone were in better shape. It was the first preview I have ever had to roll back.

using System;
using System.Collections.Generic;
using System.Text;
using Newtonsoft.Json;
using Windows.Security.Cryptography;
using Windows.Security.Cryptography.Core;

/// <summary>
/// Based on http://www.contentmaster.com/azure/creating-a-jwt-token-to-access-windows-azure-mobile-services
/// Reimplemented the cryptographic part for Windows Phone 8.1
/// </summary>
public class JsonWebToken
{
    /// <summary>
    /// Create an HMAC-SHA256 signature hash
    /// </summary>
    /// <param name="signingKey">The secret signing key</param>
    /// <param name="bytesToSign">The bytes to sign (header.payload)</param>
    /// <returns>The raw signature bytes</returns>
    private static byte[] HMACSHA256(byte[] signingKey, byte[] bytesToSign)
    {
        var signingKeyBuffer = CryptographicBuffer.CreateFromByteArray(signingKey);
        var bytesToSignBuffer = CryptographicBuffer.CreateFromByteArray(bytesToSign);

        var hmacAlgorithm = MacAlgorithmProvider.OpenAlgorithm(MacAlgorithmNames.HmacSha256);
        var hash = hmacAlgorithm.CreateHash(signingKeyBuffer);
        hash.Append(bytesToSignBuffer);
        string base64Hash = CryptographicBuffer.EncodeToBase64String(hash.GetValueAndReset());
        return Convert.FromBase64String(base64Hash);
    }

    public static string Encode(object payload, string key)
    {
        return Encode(payload, Encoding.UTF8.GetBytes(key));
    }

    public static string Encode(object payload, byte[] keyBytes)
    {
        var segments = new List<string>();
        var header = new { alg = "HS256", typ = "JWT", kid = 0 };

        byte[] headerBytes = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(header, Formatting.None));
        byte[] payloadBytes = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(payload, Formatting.None));

        segments.Add(Base64UrlEncode(headerBytes));
        segments.Add(Base64UrlEncode(payloadBytes));

        var stringToSign = string.Join(".", segments.ToArray());
        var bytesToSign = Encoding.UTF8.GetBytes(stringToSign);

        byte[] signature = HMACSHA256(keyBytes, bytesToSign);
        segments.Add(Base64UrlEncode(signature));
        return string.Join(".", segments.ToArray());
    }

    // Base64url encoding as required by the JWT spec
    private static string Base64UrlEncode(byte[] input)
    {
        var output = Convert.ToBase64String(input);
        output = output.Split('=')[0];      // Remove any trailing '='s
        output = output.Replace('+', '-');  // 62nd char of encoding
        output = output.Replace('/', '_');  // 63rd char of encoding
        return output;
    }

    internal static string TestJWT()
    {
        var privateKey = "secret";
        // Use UTC so the exp claim is correct regardless of the local time zone
        var issueTime = DateTime.UtcNow;
        var utc0 = new DateTime(1970, 1, 1, 0, 0, 0, 0, DateTimeKind.Utc);
        var exp = (int)issueTime.AddMinutes(60).Subtract(utc0).TotalSeconds;
        var payload = new
        {
            exp = exp,
            ver = 1,
            aud = "[Your AUD]",
            uid = "[A unique identifier for the authenticated user]"
        };
        return JsonWebToken.Encode(payload, privateKey);
    }
}
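If you also need to check a token on the receiving side, here is a small verification method you could add to the class. This is my addition, not part of the original sample: it recomputes the HMAC over the first two segments and compares it with the signature segment.

    /// <summary>
    /// Verify a token produced by Encode() by recomputing its signature.
    /// NOTE: illustration only - a full validator must also parse the
    /// payload and reject expired tokens (exp claim).
    /// </summary>
    public static bool VerifySignature(string token, string key)
    {
        var parts = token.Split('.');
        if (parts.Length != 3)
            return false;
        var bytesToSign = Encoding.UTF8.GetBytes(parts[0] + "." + parts[1]);
        var signature = HMACSHA256(Encoding.UTF8.GetBytes(key), bytesToSign);
        return Base64UrlEncode(signature) == parts[2];
    }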

Enjoy your day

AndiP

Using HLP files in Windows 10

It is amazing how some vendors of libraries in the automation industry still require you to read help files in the old Microsoft HLP format. Trying to open such a file results in Edge showing you the following screen:
(Image: Error opening Help in Windows-based programs: “Feature not included” or “Help not supported”)

If you think you can simply download and install the version for Windows 8.1, you are wrong. But do not throw away the downloaded MSU file (for Windows 8.1 x64 it is named Windows8.1-KB917607-x64.msu).

Start your command prompt as Administrator!

First extract the content of the MSU file into another directory:

md ContentMSU
expand Windows8.1-KB917607-x64.msu /F:* .\ContentMSU

Now we can extract the contained CAB file:

cd ContentMSU
md ContentCAB
expand Windows8.1-KB917607-x64.cab /F:* .\ContentCAB

This will extract 279 files. Depending on your culture and language settings, we need to locate the right MUI file. My language is German, so I use “de-”; English folks use “en-”.

cd ContentCAB
dir amd64*de-*.

People who use the x86 variant need to run “dir x86*de-*.” instead. Navigate to the path that is listed, in my case:

cd amd64_microsoft-windows-winhstb.resources_31bf3856ad364e35_6.3.9600.20470_de-de_1ab8cd412c1028d0

Here we find “winhlp32.exe.mui”. We need to replace %SystemRoot%\de-de\winhlp32.exe.mui with this new file:

takeown /f "%SystemRoot%\de-de\winhlp32.exe.mui"
icacls "%SystemRoot%\de-de\winhlp32.exe.mui" /grant "%UserName%":F
ren %SystemRoot%\de-de\winhlp32.exe.mui winhlp32.exe.mui.w10
copy winhlp32.exe.mui %SystemRoot%\de-de\winhlp32.exe.mui



takeown /f "%SystemRoot%\winhlp32.exe"
icacls "%SystemRoot%\winhlp32.exe" /grant "%UserName%":F
ren %SystemRoot%\winhlp32.exe winhlp32.exe.w10

cd ..

dir *.exe /s

Find the right path, starting either with amd64 or x86, and navigate to it:

cd "amd64_microsoft-windows-winhstb_31bf3856ad364e35_6.3.9600.20470_none_1a54d9f2f676f6c2"
copy winhlp32.exe %SystemRoot%\winhlp32.exe

Cheers
AndiP

Fluid Simulation Integration with Blender

Dear readers, as promised I will now follow up with the integration of 3D objects into our tracked footage. It took a while to continue this series because my beloved father unfortunately passed away.

Today I will create some fluid simulations for our kitchen scene. Last time we tracked only the relevant part of the footage, which produced a tracked camera from frame 431 to frame 618. First we will cut down the background footage; then we modify the tracked camera to start at frame 0.

Preparations

Change the camera animation in Blender
In Blender, change the timeline to the dope sheet and navigate to frame 430 (one before our first relevant frame). Zoom in so that you can see the keyframes. Now select the menu “Select – Before Current Frame”. With the mouse over the dope sheet, hit “X” to delete all keyframes from 0 to 430.

Now navigate to frame 431 and select the menu “Select – After Current Frame”. Hit “G” and type “431-”, which moves all selected keyframes to frame 0. Make sure to type the minus character at the end!

Switch back to the timeline and set start and end accordingly (0/187).

Use Adobe Media Encoder to cut down our background footage
Open Adobe Media Encoder and open the preferences (Ctrl+,). Select Appearance/Display Format and make sure the frame rate matches the frame rate of the background footage (in my case 25 FPS).
Close the preferences and drag in your footage. Choose uncompressed AVI and name the file, in my case “SHOOT27_431-618.avi”. Since Adobe Media Encoder gives us no direct way to show the current frame, we have to work with the time code. Because I recorded my footage without resetting my camera’s time code, the clip starts at time code 11:40:51:14, which translates to 11 hours, 40 minutes, 51 seconds and 14 frames. The last number is the frame; at 25 FPS it ranges from 0 to 24.

Since we want to start encoding at frame 431, we add this to the clip’s starting frame number of 14: 431 + 14 = 445. Click on the time code and change the last number to 445.

As soon as you hit <ENTER> to confirm the value, the time code correctly recalculates to 11:41:08:20. Now set the in point by clicking the button right next to the time code.

Similar to the start frame, we navigate to the required end frame. Move the current position to the first frame and again change the last number from 14 to 618 + 14 = 632. After the position has changed, set the out point by clicking the second button to the left of the time code. Make sure the output video has the same aspect ratio and frame rate as the original footage, then encode the video.
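As a side note, this time code arithmetic is easy to script if you need it more often. A small illustrative C# helper (not part of any tool used here):

// Add a frame offset to an hh:mm:ss:ff time code at a given frame rate
static string AddFrames(int h, int m, int s, int f, int framesToAdd, int fps)
{
    long total = ((long)h * 3600 + m * 60 + s) * fps + f + framesToAdd;
    long frames = total % fps; total /= fps;
    long sec = total % 60; total /= 60;
    long min = total % 60; total /= 60;
    return string.Format("{0:00}:{1:00}:{2:00}:{3:00}", total, min, sec, frames);
}

// AddFrames(11, 40, 51, 14, 431, 25) returns "11:41:08:20" -- the in point above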

Create 3D Model and set background footage

Now let’s create our 3D model of a drinking glass, plus a plane below it where we want to simulate a wet surface later. I won’t explain creating a simple drinking-glass model in this tutorial. Align the 3D models with the model you got from PFTrack; in my case this is the part of the kitchen with the window.

Switch the renderer to Cycles in the top dropdown in Blender (see image). Next we want to load our background footage, so we can move the PFTrack model of our kitchen to another layer. With the mouse in the 3D view, hit “N” to show the right toolbar.

Scroll down to “Background Images” and select the checkbox. Then load the background movie and apply the settings as illustrated in the screenshot.

Now we will set the materials for our two objects. Select the drinking glass, then the material tab, and create a new material by clicking the “+ New Material” button.

Img-09

Name the material “Glass”. Select the “Glass BSDF” shader and set the color to pure white (1.0/1.0/1.0; the default is 0.8/0.8/0.8). Leave the index of refraction (IOR) at 1.310. You can find a list of materials with their corresponding refraction indices here.


Since we also want to render the background footage so that its reflection is caught in the glass, we need to change the world settings. In the “World” tab, open the “Surface” section and click the “Use Nodes” button. Then click the small button to the right of the color and select “Image Texture” from the pop-up menu. Select the AVI file we created earlier, set the number of frames, the start frame, and Auto Refresh. Make sure you set Vector to “Texture Coordinate | Window”.

1. Img-08-1 2. Img-08-2 3. Img-08-3

Create the fluid simulation system

Create a fluid domain

Create a cube around the surface and the drinking glass; switch to wireframe to see the objects within. This will be our water simulation domain. Create a material “Water” for this object: again use the Glass BSDF shader, but change the refraction index (IOR) to 1.301 and the color to pure white (1/1/1). Select the cube, switch to the “Physics” tab and click “Fluid”. Set the type to “Domain”.

TIME: The timing of the simulation is very important. There are textboxes for Start and End, and these indicate start and end time in SECONDS! So in our case we start with 0 and end with 186 (frames) / 25 (FPS) = 7.44 (~7.4). Set the SPEED setting to 1, which means normal speed.

DOMAIN SIZE: To create a realistic water simulation, the simulator needs to know the real-world size the domain cube represents. Under the section “Fluid World”, find the setting “Real World Size”. This value indicates the longest side of the cube in meters, so a value of 0.4 represents 40 cm.

SLIP TYPE: You can find these settings under the section “Fluid Boundary”. The slip type determines the stickiness of the boundary’s surface (surface adhesion). You can change the surface’s smoothing (0 = off, 1 = standard, …) and subdivision (the resolution of the surface for the calculations: 1 = off, 2 = 1 subdivision, 3 = 2 subdivisions, …). Be careful: a high resolution value increases the calculation time of the simulation significantly.

PARTICLES: To create a more realistic simulation, use particles. To be able to use particles (splashes when the fluid hits the boundary or obstacles) you need to set the subdivision (boundary settings) to at least 2. Tracer defines how many particles already exist at the beginning of the simulation.

Create fluid obstacles

Although it is possible to use the objects we already created as obstacles in the simulation, it is sometimes more effective to create simplified versions of the objects as obstacles, to reduce calculation time. To illustrate this, duplicate the drinking glass and the “wet” surface and name them “Drinking glass obstacle” and “Kitchenette surface obstacle” or similar.

Make sure you turn off rendering for these obstacle objects. In the “Physics” tab select “Fluid” and choose “Obstacle” as the type.
Img10-DrinkingGlassObstacle Img12-PlaneObstacle

Since our surface is now set as an obstacle, we can modify the visible kitchenette surface into a wet ground. For that, subdivide the visible plane several times and use the extrusion tools to shape it like a small water surface (see the example below the glass in the next section).

Create InFlow object

Since we want the water to appear mysteriously in mid-air and fill the glass, we need an inflow object to define where the water comes from. Create a small sphere, which you also need to hide from rendering. This sphere must reside inside the fluid domain!

In the “Physics” tab, activate “Fluid” and set the type to “Inflow”. We initialize the volume with the volume of the object (the sphere), so we set “Volume initialization” to “Volume”. In my case I want velocity along the positive X axis, so I set the X velocity to 0.6.

We also do not want to pour water into the glass constantly, so we enable the inflow object only for a brief moment. Navigate to frame 0 in the timeline, activate the “Enabled” checkbox and right-click it to insert a keyframe. Then move forward to frame 18, deactivate the “Enabled” checkbox and set another keyframe there.

Bake the fluid simulation

To bake the fluid simulation, simply switch to the fluid domain object and hit the “Bake” button in the Physics tab. This will take some time. Afterwards you will see the simulation when you scrub through the timeline, and also when you render it:

Img14-BakedWater Img15-RenderNoNodes

Preparing render for After Effects composition

Finally, we do not want to render the background footage directly, but composite it later in After Effects. To do that we need to extract the background footage. Since we want to keep all the reflections, we cannot simply remove the background render in the world tab.

For the drinking glass and the fluid domain, set the Pass Index to 1 (Object tab). For the water surface, set the Pass Index to 2. Switch to the node editor and enable the compositing nodes in the bottom toolbar. Use ID Mask nodes to isolate an alpha map for the objects with pass index 1 and 2, then use the “Set Alpha” node to isolate the objects from the rendered image. We can now take the result for ID 2, modify it with RGB Curves, and make it slightly transparent with an Alpha Over node: we place the isolated result in the foreground (lower image input), set the upper image to black (0/0/0) with alpha 0, and use the factor to make the water surface transparent; in my case I set it to 0.427. We then combine the water surface and the glass (with the water) using another Alpha Over and take the result as the final render.

Composite-Nodes

Img16RenderedNodes

Now we can finally render our animation and integrate it with the original footage in After Effects. I hope you have enjoyed this tutorial.

Cheers
AndiP

Blender 3D Integration with PFTrack

I recently bought a DJI Ronin to be able to do smooth shots. Not only is a smooth shot much more enjoyable for the eye, it is also much easier to track when you do 3D integration. While shooting smooth shots with the DJI Ronin still requires a lot of practice, the shots are a lot smoother than if they were shot freehand.

Today I will show you how to import a moving shot into PFTrack. We will of course track the camera, but also position a 3D model (a glass) on the kitchenette. In the image on the right I have visualized my shot, in which the camera moves from left to right.

I structured the post into the following topics:

  • PFTrack application
    • Creating the required nodes in PFTrack
    • Configure “Photo Survey” node, match points and solve camera
    • Configure “Photo Mesh” and create the mesh
    • Exporting mesh and camera data to Blender
  • Blender application
    • Importing mesh and camera data into Blender
    • Verification (optional) but recommended

I am using PFTrack 2015.05.15 for this. First create a new project in PFTrack. Change to the project view by clicking the “PRJ” button in the lower left corner (image). Click the “Create” button, fill out Name, Path, etc., and click “Confirm”. Enable the file browser and the project media browser by clicking the corresponding icons at the top of the application (image). Import the footage by dragging it into the “Default” folder, or create your own project structure.

Creating the required nodes in PFTrack

Drag your shot into the node window. In the lower left corner, enable the nodes menu (image). Click “Create” to create a “Photo Survey” node. Set up the “Photo Mesh” and “Export” nodes with the same procedure. Your node tree should look like this:
image

Configure “Photo Survey” node, match points and solve camera

Double click the “Auto Track” node. Since the calculations take quite some time, we should only calculate what is necessary. Because I have a much longer recording (switching the camera on and off while holding the heavy DJI Ronin is quite a challenge), I only need to track a small portion, in my case from frame 431 to 618. Open the “Parameters” of the “Photo Survey” node and set “Start frame” and “End frame” in the “Point Matching” section. Finally hit “AutoMatch” (image) and wait until the calculations are done.

After the points have been tracked, click the “Solve all” button (image) in the “Camera Solver” section. If you enable the split view (see the buttons in the right corner) you will end up with a point cloud and a solved camera:
image

Configure “Photo Mesh” node and create the mesh

After solving the camera we need to create depth maps for each frame and then create a mesh. Note that you won’t get a perfect mesh, but it will suffice to help place things in the 3D world in Blender later. Switch to the “Photo Mesh” node. If you do not require all points, set the bounding box accordingly in the “Scene” section: click the “Edit” button, and if you hover over the planes of the bounding box in the 3D view they will highlight and can be moved by dragging them with the mouse. Once you are finished, hit “Edit” again.

Let’s create the depth maps next. Depending on your requirements, set the depth map resolution to “Low”, “Medium” or “High”. Be aware that a higher resolution results in much longer calculations. I left the variation % at the default of 20 and set my resolution to “Medium”. Now hit the “Create” button in the “Depth Maps” section. This will take a while.

After building the depth maps we can create the mesh. Note that you could also create a smaller portion of the mesh by setting the bounding box in the “Mesh” section. Create the mesh simply by hitting the “Create” button in the “Mesh” section. And finally we should have our mesh:
image

Exporting mesh and camera data to Blender

PFTrack offers to export the mesh and camera data in various formats: “Open Alembic” and “Autodesk FBX 2010 (binary)”. You can also export the mesh without the camera to “Wavefront OBJ” and “PLY”. The “Open Alembic” export fails on my Windows PC, so I have not been able to use it so far.

For Blender that leaves two options: “Autodesk FBX 2010 (binary)” and “Wavefront OBJ”.

Unfortunately we have two issues with the FBX format. First of all, Blender can only import “Autodesk FBX 2013 (binary)”, so we need an extra step of converting the FBX file with Autodesk’s FBX Converter 2013.2. This lets us import the cameras and the mesh, but the camera rotations are completely messed up. I do not know whether this is a bug in Blender or in PFTrack, but it does not make for a smooth workflow. So what is the solution?

The solution is to split up the camera and mesh exports. First we export the mesh as “Wavefront OBJ”. Since Blender uses the Z axis for up/down, we change the default settings for the coordinate system to “Righthanded” and “Z up”. Then we name an output file (e.g. Kitchen-z-up.obj) and click the “Export Mesh” button.

To export the camera data we use the previously created “Export” node connected to the “Photo Survey” node. In the parameters of the “Export” node we select the format “Collada DAE”. Choose what to export in the tabs on the right side; since I won’t be needing the point cloud, I removed it from the export. Make sure that the camera is selected and that “Separate Frame” is NOT checked; if it were, PFTrack would create a separate camera for each frame, and since we want to render an animation later we leave it unchecked. Name the output file (e.g. KitchenNoPC.dae) and hit the “Export Scene” button.

image

So we end up with two files: one (Kitchen-z-up.obj) contains our model, the other (KitchenNoPC.dae) our animated tracked camera.

Importing mesh and camera data into Blender

Start up Blender (I am using version 2.74). Open the user preferences (Ctrl+Alt+U) and select the “Add-ons” tab. Select the category “Import-Export” and make sure that the “Import-Export: Wavefront OBJ format” add-on is enabled.

First make sure that the render settings are correct: set the resolution (it should match the footage) and the frame rate.
(It is crucial to set the frame rate correctly before you import the animated camera!! Otherwise the camera will be out of sync, even if you change the frame rate later!!)
image

Select “File/Import/Wavefront (.obj)” from the menu. Navigate to the mesh OBJ file you created with PFTrack (e.g. “Kitchen-z-up.obj”). Make sure that you change the import settings in the lower left corner as shown in this image:

Then click on “Import” to import the object.


Select “File/Import/Collada (Default) (.dae)” from the menu. Navigate to the exported Collada camera track file (e.g. KitchenNoPC.dae) and click “Import COLLADA”.

This will import two objects: an empty object “CameraGroup_1” and the animated camera “ImageCamera01_2” (the names can vary, of course). Although the position of the camera looks correct right after the import, the camera will rotate 90 degrees around the global X axis once you scrub through the timeline. I assume the Pixelfarm team meant “ImageCamera01_2” to be parented to “CameraGroup_1”, because the empty object is rotated 90 degrees on the X axis.

So simply select the animated camera “ImageCamera01_2”. In the object settings (image), select Parent and choose the empty object “CameraGroup_1”.

And we are almost finished. Since PFTrack exports the camera over the full length of the shot, you might want to limit the animation range in the timeline window like so:
image

Finally we need to fix the camera’s field of view, which is also not correctly exported/imported by PFTrack. In PFTrack, double click the “Photo Survey” node; you will find the camera settings in the camera tab:
image

So back in Blender, select the animated camera (“ImageCamera01_2”) and switch to the camera settings. Change the sensor fit to “Horizontal” and set the width to the film back value from PFTrack, in this case “14.75680”.
(Make sure your render settings are set to the same aspect ratio as your footage!)
image

Then change the focal length of the camera to the value from PFTrack, in this case “12.851”.
image

Verification (optional but recommended): To check that everything is correct, I recommend loading the original footage as a background and seeing whether it matches. Mistakes with, for example, the frame rate settings happen easily. To do this, select the animated camera again. With the mouse cursor in the 3D view, hit “N” to show the settings of the selected object in the right bar of the 3D view.

Find the “Background Images” setting and check it. Then hit the “Add Image” button.

Then select “Movie Clip” instead of “Image” and uncheck the option “Camera Clip”. Click “Open”, navigate to the footage and click “Open Clip”.

Then click “Front” and set the opacity to 0.500.
Now you can scrub through the timeline and check that everything lines up perfectly.

image

The next thing, of course, is to create some 3D objects and place them on the table. For the final render we simply move the mesh to another layer and mix the original footage with our CGI objects in After Effects. Pay attention to things like lighting and reflections; maybe a topic for another post. So long.

Cheers
AndiP

Connect ZyWALL 35 with Azure VPN site to site

Some time ago I got an “old” ZyWALL 35 from a former colleague, and I always wanted to configure a site-to-site connection to Azure. Although Microsoft only provides automatic configuration scripts for the more advanced professional enterprise VPN gateways, you can configure the device yourself (if it is capable of VPN). This can have some caveats, like different expected key sizes, which you need to work around.

Hopefully this helps others with a ZyWALL 35 to configure a site-to-site connection, and also those who happen to have a different device.

I divided the article into the following sections:

  1. Setting up the virtual network environment in Azure
  2. Set Shared Key Length in Azure VNET to 31
  3. Configuring the ZyWALL35

May you succeed! Cheers, AndiP

1. Setting up the virtual network environment in Azure

My small private local network operates in the address range 10.0.0.0/24 (10.0.0.0 – 10.0.0.255). I want my Azure machines to operate in the VNET address range 10.0.1.0/24. So first of all we create a virtual network in Azure for that purpose. Log in to the Azure management portal. With the “+ NEW” button in the lower left corner we create a new virtual network.

image

image

image

Then hit the “Create” button. After Azure has created the virtual network, we are presented with the dashboard of our new virtual network. From there we add more subnets.

image

Here we add another subnet (do not forget to save your changes with the SAVE button!). I left part of the address range empty, because it will be needed by the gateway subnet for the VPN site-to-site connection. Unfortunately you cannot add the gateway subnet here in the new portal.

Name       Address Space   CIDR Block
Subnet-2   10.0.1.0/24     10.0.1.32/27

Now we configure our site-to-site connection by clicking into the VPN connection section. Inside the VPN connection settings we select “Site-to-site” and give the local site a name (in our case “SpectoLogicLocalVPN”). As the VPN gateway IP address we provide the public-facing IP address of our ZyWALL 35 behind the internet modem. Finally we provide the address ranges of the local network we would like to connect. Due to a bug in the new Azure portal you need to check “Create gateway immediately”*.

It is also important to set the optional gateway configuration (see images below). Set the routing type to “Static”!

*Otherwise you will get the error “Deployment Failed”. The reason might be that you are not able to create the required gateway subnet in the subnet section. Also, once you have created a site-to-site connection and removed it again, you cannot remove the newly created “Gateway Subnet” in the subnets, even though you no longer need it (another bug).

image    image

image

We can now see the new, automatically added subnet “GatewaySubnet”:
image

Creating the gateway will take some time (up to 25 minutes)!

Once the gateway has been created we will see the result here. The public gateway IP address will be needed later, when we configure our ZyWALL 35!
image

2. Set Shared Key Length in Azure VNET to 31

Azure uses shared keys that are longer than the ZyWALL 35 supports, so we have to change the key size to a smaller value manually, via a PowerShell script. In our case: 31!

As David pointed out in the comment section there is now an easier way to achieve this.

We also need to switch to the old Azure portal, as the new portal does not offer shared key management. Navigate to your VNET and you will find the “Manage Key” button in the dashboard of the VNET:
image

Unfortunately the key is 32 characters long:
image

While we are at it: although we gave our networks meaningful names in the new Azure portal, the names of the local VPN network and the Azure network differ completely from what we originally entered. I assume this is because of the new “Resource Group” management. It makes things a bit more complex, as we need the “real” names for our PowerShell script.

The VNET name can be found either in the new portal here:
image
or in the old portal here:
image

The local network name can be found in the new portal here:
image
or in the old portal here:
image

Now that we have somehow managed to find the real names, we can use them in the script below. Make sure you have imported the Azure publish settings file and that the management certificate is installed in either your personal or local machine certificate store. If not, you need to know the thumbprint of the certificate and assign it to the variable $mgmtCertThumb. If the certificate can be found in the store, the script will locate it for you:

# Sets a VPN key with a smaller key length
# © by Andreas Pollak / SpectoLogic

$subID = (Get-AzureSubscription -Current).SubscriptionId
$VNetName = "Group SpectoLogic_Resources SpectoLogicVPN"
$VNetNameLocal = "9A10F5F7_SpectoLogicLocalVPN"
$uri = "https://management.core.windows.net/" + $subID + "/services/networking/" + $VNetName + "/gateway/connection/" + $VNetNameLocal + "/sharedkey"
$body = '<?xml version="1.0" encoding="utf-8"?><ResetSharedKey xmlns="http://schemas.microsoft.com/windowsazure"><KeyLength>31</KeyLength></ResetSharedKey>'

# Identify the management certificate thumbprint
$mgmtCertThumb = $null

$mgmtCertCandidateCount = (Get-ChildItem -Path cert:\CurrentUser\My\ | Where-Object {$_.FriendlyName -like ((Get-AzureSubscription -Current).SubscriptionName + '*')}).Count
if ($mgmtCertCandidateCount -ne 1)
{
    $mgmtCertCandidateCount = (Get-ChildItem -Path cert:\LocalMachine\My\ | Where-Object {$_.FriendlyName -like ((Get-AzureSubscription -Current).SubscriptionName + '*')}).Count
    if ($mgmtCertCandidateCount -eq 1)
    {
        $mgmtCertThumb = (Get-ChildItem -Path cert:\LocalMachine\My\ | Where-Object {$_.FriendlyName -like ((Get-AzureSubscription -Current).SubscriptionName + '*')} | Select-Object -First 1).Thumbprint
    }
    else
    {
        echo "Could not locate the certificate thumbprint of the corresponding Azure management certificate!"
        echo "Please make sure to install the Azure management certificate in the 'Personal' folder"
        echo "of either the 'local machine' or 'current user' certificate store on your machine!"
        echo "The friendly name of the certificate must start with the SubscriptionName to be automatically detected!"
    }
}
else
{
    $mgmtCertThumb = (Get-ChildItem -Path cert:\CurrentUser\My\ | Where-Object {$_.FriendlyName -like ((Get-AzureSubscription -Current).SubscriptionName + '*')} | Select-Object -First 1).Thumbprint
}

$headerDate = '2012-03-01'
$headers = @{"x-ms-version" = "$headerDate"}

Invoke-RestMethod -Uri $uri -Method Put -Body $body -Headers $headers -CertificateThumbprint $mgmtCertThumb

After running the script we can acquire our key and store it for later, when we configure the ZyWALL 35. It should now be 31 characters long. ATTENTION: DO NOT RECREATE THE KEY in the portal, otherwise it will be 32 characters long again. Use the script to regenerate the key instead!
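If you prefer C# over PowerShell, the same management REST call can be made with HttpClient and the management certificate. A sketch under the same assumptions (placeholder subscription ID and thumbprint; network names containing spaces may need URL encoding):

using System;
using System.Net.Http;
using System.Security.Cryptography.X509Certificates;
using System.Text;

class ResetSharedKey
{
    static void Main()
    {
        // Same values as in the PowerShell script above
        string subId = "<yourSubscriptionId>";
        string vnetName = "Group SpectoLogic_Resources SpectoLogicVPN";
        string vnetNameLocal = "9A10F5F7_SpectoLogicLocalVPN";
        string uri = "https://management.core.windows.net/" + subId +
            "/services/networking/" + vnetName +
            "/gateway/connection/" + vnetNameLocal + "/sharedkey";
        string body =
            "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
            "<ResetSharedKey xmlns=\"http://schemas.microsoft.com/windowsazure\">" +
            "<KeyLength>31</KeyLength></ResetSharedKey>";

        // Load the management certificate from the current user store by thumbprint
        var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        store.Open(OpenFlags.ReadOnly);
        var cert = store.Certificates.Find(
            X509FindType.FindByThumbprint, "<yourCertThumbprint>", false)[0];

        // WebRequestHandler (System.Net.Http.WebRequest) supports client certificates
        var handler = new WebRequestHandler();
        handler.ClientCertificates.Add(cert);
        using (var client = new HttpClient(handler))
        {
            client.DefaultRequestHeaders.Add("x-ms-version", "2012-03-01");
            var content = new StringContent(body, Encoding.UTF8, "application/xml");
            var response = client.PutAsync(uri, content).Result;
            Console.WriteLine(response.StatusCode);
        }
    }
}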

3. Configuring the ZyWALL35

Finally we get to configure the ZyWALL 35. First we download the VPN device script as a text file, from which we can read the basic configuration settings such as hash and encryption algorithms:

image

We log on to the ZyWALL 35 configuration website and select VPN from the “Security” menu.

image image

Configure the global settings as shown in this screenshot:
image

Select the tab “VPN Rules (IKE)”. Add a new gateway policy by clicking the “add new gateway policy” button (image).

In the “Property” section we name our local VPN gateway policy “SpectoLogicVPNGWPolicy”. Also make sure NAT Traversal is checked!

In the “Gateway Policy Information” section we need to provide our local public IP address as well as the public Azure VPN gateway address. Under “My ZyWALL” we enter our local public IP address:
image

Under “Primary Remote Gateway” we provide the public Azure gateway IP address. In our home scenario we leave IPSec High Availability unchecked.
image

Since I have not set up any PKI infrastructure, I go for simple “Pre-Shared Key” authentication. Note that the ZyWALL 35 only supports pre-shared keys up to 31 characters, which conflicts with Azure, which by default does not allow smaller key sizes. See the section “Set Shared Key Length in Azure VNET to 31” above on how to change that.
image

We leave the extended authentication settings untouched (uncheck “Enable Extended Authentication”) and configure the IKE proposal.
image

Finally we hit apply. Back in the “VPN Rules (IKE)”-Tab  we select “Add Network Policy”:
image

We name the VPN network policy “SpectoLogic VPN Net Policy” and set it to active (check that checkbox!). Also check “Nailed-up”!
image

The linked gateway policy should already appear populated:
image

In the “Local Network” section we select “Subnet Address” from the “Address Type” dropdown, provide the starting IP address of our local network, and define the subnet mask for the range.
image

Now we also need to configure our remote network under the “Remote Network” section. Again we select “Subnet Address” from the “Address Type” dropdown and provide the starting IP address and subnet mask:
image

For the IPSec proposal, select the same values we already used for the VPN gateway policy (exception: set PFS to NONE!):
image

So we end up with:
image

To connect or disconnect the VPN in the new portal, click through the following elements:
image
You can also pin the last element to your dashboard by right-clicking the name of the VNET in the middle section:
image

Finally we can enjoy our site-to-site connection (new portal / old portal):

imageimage