Upload a Windows 2003 Hyper-V VHD to Azure (Part 2 of 2)

In part one of this two-part blog post, I created a Hyper-V VHD disk from a Windows 2003 VMWare VM. Now I want to upload the VHD disk to Azure and create a VM there. I will run the Domino server on the VM and test connecting to it with a web browser and a Notes client.

Note: I provide an abundance of details with screenshots.

Log in to Azure

If you don't already have Azure PowerShell version 1.4 or later installed, read How to install and configure Azure PowerShell.

  1. Open Azure PowerShell and sign in to your Azure account. A pop-up window opens for you to enter your Azure account credentials.

Login-AzureRmAccount

  2. Get the subscription names for your available subscriptions.

Get-AzureRmSubscription | Sort-Object subscriptionName | Select-Object SubscriptionName

  3. Set the correct subscription using the subscription name. Replace Pay-As-You-Go with the name of your subscription.

Select-AzureRmSubscription -SubscriptionName Pay-As-You-Go

AzureSubscription

Get the storage account

You need a storage account in Azure to store the uploaded VM image. You can either use an existing storage account or create a new one.

To show the available storage accounts, type:

Get-AzureRmStorageAccount

If you need to create a storage account, follow these two steps:

1. You need the name of the resource group where the storage account should be created. To find out all the resource groups that are in your subscription, type:

Get-AzureRmResourceGroup

To create a resource group named vmResourceGroup in the Central US region, type:

New-AzureRmResourceGroup -Name vmResourceGroup -Location "Central US" -Tag @{Name = 'Created By'; Value = 'Randy'}

AzureResourceGroup

2. Create a storage account named vmrhrstorageaccount in this resource group by using the New-AzureRmStorageAccount cmdlet:

New-AzureRmStorageAccount -ResourceGroupName vmResourceGroup -Name vmrhrstorageaccount -Location "Central US" -SkuName "Standard_LRS" -Kind "Storage"

Valid values for -SkuName are:

  1. Standard_LRS – Locally redundant storage.
  2. Standard_ZRS – Zone-redundant storage.
  3. Standard_GRS – Geo-redundant storage.
  4. Standard_RAGRS – Read-access geo-redundant storage.
  5. Premium_LRS – Premium locally redundant storage.

AzureStorage

Upload the VHD to your storage account

I know of three ways to upload the VHD to a storage account. I will briefly review how to use each one. I think PowerShell will always work, but CloudBerry has a Pause feature that I really like.

  1. PowerShell
  2. Storage Explorer
  3. CloudBerry Explorer

PowerShell

Use the Add-AzureRmVhd cmdlet to upload the image to a container in your storage account. I am uploading the file windows2003.vhd from “F:\Users\Randy\Documents\Virtual Machines\Domino R8.5 Azure\” to a storage account named vmrhrstorageaccount in the vmResourceGroup resource group. The file will be placed into the container named vmcontainer and the new file name will be vmWindows2003VHD.vhd.

The first time I ran this code, I specified 4 uploader threads to see if that would improve upload performance.

-NumberOfUploaderThreads 4

My understanding is that I need a bigger pipeline for uploading as the number of uploader threads is increased. So setting it to 32 with my current pipeline would be of no benefit. This time I am just going to use a single thread.

$rgName = "vmResourceGroup"
$urlOfUploadedImageVhd = "https://vmrhrstorageaccount.blob.core.windows.net/vmcontainer/vmWindows2003VHD.vhd"
Add-AzureRmVhd -ResourceGroupName $rgName -Destination $urlOfUploadedImageVhd -LocalFilePath "F:\Users\Randy\Documents\Virtual Machines\Domino R8.5 Azure\Windows2003.vhd"
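For reference, my first attempt with four uploader threads used the same cmdlet with one extra parameter. This is just the variant I tried; adjust the thread count to whatever your upload bandwidth can support:

Add-AzureRmVhd -ResourceGroupName $rgName -Destination $urlOfUploadedImageVhd -LocalFilePath "F:\Users\Randy\Documents\Virtual Machines\Domino R8.5 Azure\Windows2003.vhd" -NumberOfUploaderThreads 4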

AzureUpload

AzureMD5Hash

EmptyBlock

AzureUploading

21 hours later …

AzureUploading2

If successful, you get this response:

AzureUploadResponse

Storage Explorer

I will assume that you already have Azure Storage Explorer installed. Click on Upload.

AzureStorageExplorer

I select the VHD file that I want to upload.

AzureStorageExploreUpload

Initializing Upload took several minutes without any updates.

AzureStorageExplorerInitialize

The upload begins! Only 1407 hours …

AzureStorageExplorer1407

A few minutes later there is no change in how much is uploaded.

AzureStorageExplorer1848

I cancel the upload.

AzureStorageExplorerCancel

Perhaps Storage Explorer is not the right tool for what I need here.

Cloudberry Explorer

I download CloudBerry Explorer for Azure Blob Storage from http://www.cloudberrylab.com.

I install and run the software. I choose the Freeware edition.

CloudBerry

I configure a connection to my vmrhrstorageaccount on Azure.

CloudBerryConnection

I select my Windows2003.vhd file and click on Copy as a Page Blob. Page blobs are mainly used for VHDs. The copy process begins.

Note the Pause button is what excites me! I want to be able to pause the upload at times. For example, I use my Internet connection for all voice calls.

CloudBerryPause

The blob file is being uploaded …

CloudBerryVMcontainer

The upload is 100% completed. It took about 24 hours to complete … just a few hours longer than the initial estimate.

CloudBerryCompleted

Check Azure Storage

I open the Azure portal and go to the storage account that I created:

AzureStorageAccount

I open the Blobs. I can see the vmcontainer that I created.

AzureBlobStorage

I click on vmcontainer. I can see the vhd file that I uploaded.

AzureResourceContainerCreated
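The uploaded blob can also be verified from PowerShell instead of the portal. This is only a sketch: it assumes the Azure.Storage module that ships alongside AzureRM, and the shape of the Get-AzureRmStorageAccountKey output varies a little between AzureRM versions.

# Retrieve a storage account key and build a storage context
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName "vmResourceGroup" -Name "vmrhrstorageaccount")[0].Value
$ctx = New-AzureStorageContext -StorageAccountName "vmrhrstorageaccount" -StorageAccountKey $key

# List the blobs in the container to confirm the VHD arrived
Get-AzureStorageBlob -Container "vmcontainer" -Context $ctx | Select-Object Name, Length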

Create a Managed Disk from the VHD

I am creating a managed disk. With managed disks, Azure manages the storage accounts used for the VM disks for you. You specify the type (Premium or Standard) and size of disk you need, and Azure creates and manages the disk for you. You don't have to worry about placing the disks across multiple storage accounts to stay within the storage account scalability limits; Azure handles that for you.

I am referencing another resource for creating the VM. I like to use PowerShell.

https://docs.microsoft.com/en-us/azure/virtual-machines/windows/create-vm-specialized

I run the PowerShell code to create a new resource group and a new OS disk from the uploaded VHD.

Note: I had problems with New-AzureRmDiskConfig. So I went to this webpage and followed the instructions.

https://docs.microsoft.com/en-us/powershell/azure/install-azurerm-ps?view=azurermps-5.1.1

I closed and restarted PowerShell. Then went through the login process again.

Login-AzureRmAccount

Then I continued with the PowerShell commands below.

$location = "Central US"

$destinationResourceGroup = "myWindows2003ResourceGroup"

New-AzureRmResourceGroup -Location $location -Name $destinationResourceGroup

$sourceUri = "https://vmrhrstorageaccount.blob.core.windows.net/vmcontainer/Windows2003.vhd"

$osDiskName = "myWindows2003Disk"

$diskconfig = New-AzureRmDiskConfig -Location "Central US" -AccountType StandardLRS -CreateOption Import -SourceUri $sourceUri

$osDisk = New-AzureRmDisk -DiskName $osDiskName -Disk $diskconfig -ResourceGroupName $destinationResourceGroup

The new resource group is created

AzureResourceGroupCreated

The operating system disk is also created! It is listed in the new resource group that I created.

AzureOSDisk

I can click on the disk to view the properties.

AzureOSDiskDetails

Create the new VM

First I need to create the subnet and vNet for the virtual network.

Below is the PowerShell I used to create the subnet and vNet.

$subnetName = 'win2003SubNet'

$singleSubnet = New-AzureRmVirtualNetworkSubnetConfig `
   -Name $subnetName `
   -AddressPrefix 10.0.0.0/24

$location = "Central US"

$destinationResourceGroup = "myWindows2003ResourceGroup"

$vnetName = "win2003VnetName"

$vnet = New-AzureRmVirtualNetwork `
   -Name $vnetName -ResourceGroupName $destinationResourceGroup `
   -Location $location `
   -AddressPrefix 10.0.0.0/16 `
   -Subnet $singleSubnet

I receive a warning:

WARNING: The output object type of this cmdlet will be modified in a future release.

Now I need to be able to log in to my VM using RDP, so I need a security rule that allows RDP access on port 3389. Because the VHD for the new VM was created from an existing specialized VM, I can use an account from the source virtual machine for RDP.

The Domino web server uses port 80 to serve HTTP requests.

The Lotus Notes client uses port 1352 by default to communicate with the Domino server, and Domino servers use port 1352 by default to replicate with each other.

Thus, I also want to allow access on ports 80 and 1352.

$nsgName = "myWindows2003NSG"

$rdpRule = New-AzureRmNetworkSecurityRuleConfig -Name "myRDPRule" -Description "Allow RDP" `
    -Access Allow -Protocol "Tcp" -Direction Inbound -Priority 110 `
    -SourceAddressPrefix Internet -SourcePortRange * `
    -DestinationAddressPrefix * -DestinationPortRange 3389

$httprule = New-AzureRmNetworkSecurityRuleConfig -Name "myHTTPRule" -Description "Allow HTTP" `
    -Access "Allow" -Protocol "Tcp" -Direction "Inbound" -Priority "100" `
    -SourceAddressPrefix "Internet" -SourcePortRange * `
    -DestinationAddressPrefix * -DestinationPortRange 80

$notesrule = New-AzureRmNetworkSecurityRuleConfig -Name "myIBMNotesRule" -Description "Allow IBM Notes" `
    -Access "Allow" -Protocol "Tcp" -Direction "Inbound" -Priority "120" `
    -SourceAddressPrefix "Internet" -SourcePortRange * `
    -DestinationAddressPrefix * -DestinationPortRange 1352

$nsg = New-AzureRmNetworkSecurityGroup `
   -ResourceGroupName $destinationResourceGroup `
   -Location $location `
   -Name $nsgName -SecurityRules $rdpRule, $httprule, $notesrule

I receive the same warning:

WARNING: The output object type of this cmdlet will be modified in a future release.

I review the virtual network that I created.

AzureVirtualNetwork

I click on the win2003VnetName virtual network and then on Subnets.

AzureNetworkSubnets

I see one warning in Address space, presumably because it overlaps with the virtual network used by my SharePoint server. I just have to be careful not to run that server at the same time! I will probably change the vNet settings soon.

AzureVnetSetting

Next I open my new network security group. I can see that my inbound security rules were successfully created.

AzureNSG

The RDP rule has a warning.

RDPWarning

Create a public IP address and NIC

To enable communication with the virtual machine in the virtual network, I need a public IP address and a network interface.

$destinationResourceGroup = "myWindows2003ResourceGroup"

$ipName = "myWindows2003IP"

$pip = New-AzureRmPublicIpAddress `
   -Name $ipName -ResourceGroupName $destinationResourceGroup `
   -Location $location `
   -AllocationMethod Dynamic

$nicName = "myWindows2003NicName"

$nic = New-AzureRmNetworkInterface -Name $nicName `
   -ResourceGroupName $destinationResourceGroup `
   -Location $location -SubnetId $vnet.Subnets[0].Id `
   -PublicIpAddressId $pip.Id `
   -NetworkSecurityGroupId $nsg.Id

I open the new NIC that I created in Azure. Everything looks like it is set correctly.

AzureNIC

The IP address for the public IP resource is not assigned yet; with dynamic allocation, it is assigned when the VM starts.

AzureIPAddress
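Once the VM is created and started, the assigned address can be read back with a quick check (a sketch that reuses the names from the commands above):

# The IpAddress property is populated once the dynamic public IP has been assigned
$pip = Get-AzureRmPublicIpAddress -Name "myWindows2003IP" -ResourceGroupName "myWindows2003ResourceGroup"
$pip.IpAddress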

Next I select the VM type, VM name, disk size, and the NIC.

My VMWare VM has a 40GB disk drive, a dual-core processor, and 3032MB of memory. I need something similar on Azure. I can see the list of available VM sizes at the URL here:

https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sizes-general

The Basic_A1 size has 1 CPU, 1.75GB of memory, a 40GB disk drive, and 1 NIC. That is one CPU fewer and about half the memory of my current VMWare VM.

The Basic_A2 size has 2 CPUs, 3.5GB of memory, a 60GB disk drive, and 1 NIC. The Basic sizes are an economical option for development workloads, test servers, build servers, code repositories, low-traffic websites and web applications, microservices, early product experiments, and small databases.

I want to try the Basic_A2 size, but it will put me 1 core over my quota limit of 10 cores.

I create a Support Request to increase my quota to 20. https://portal.azure.com/#create/Microsoft.Support

My Support Request is approved within two minutes!!! Awesome!

I will proceed with Basic_A2 series.

$vmName = "myWindows2003VM"

$vmConfig = New-AzureRmVMConfig -VMName $vmName -VMSize "Basic_A2"

$vm = Add-AzureRmVMNetworkInterface -VM $vmConfig -Id $nic.Id

$vm = Set-AzureRmVMOSDisk -VM $vm -ManagedDiskId $osDisk.Id -StorageAccountType StandardLRS `
    -DiskSizeInGB 40 -CreateOption Attach -Windows

New-AzureRmVM -ResourceGroupName $destinationResourceGroup -Location $location -VM $vm

I check Azure for my new Windows 2003 VM. It is running!

AzureVMCreated

I click on the VM to view the Overview details.

AzureVMDetails

I click on Connect to download the RDP file for the VM, and then click Open to continue.

RDP

I click Connect to continue.

Note: Ignore the IP address in the screenshot below. It should match the public IP address.

RDP2

I click Yes to continue.

RDP3

The first time I tried this, I got a warning message. Basically, I believe that it failed because I did not have a network adapter on the VM that I uploaded.

RDPCannotConnect

But this time it looks like it is working! I have a connection and can log into the server.

RDP4

Logon

I successfully log in.

VMlogon

Auto-Shutdown of Azure VM

As a precaution, I set my VM to auto-shutdown at 5:00 PM Central Time. This setting is made in the Azure portal. I try to do this as soon as possible to avoid unnecessary expenses while I am testing.

AzureVMAutoShutdown
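Auto-shutdown covers the end of the day, but the VM can also be deallocated on demand from PowerShell so compute charges stop sooner. A minimal sketch using the resource names from this post:

# Stop and deallocate the VM (compute billing stops while it is deallocated)
Stop-AzureRmVM -ResourceGroupName "myWindows2003ResourceGroup" -Name "myWindows2003VM" -Force

# Start it again when needed
Start-AzureRmVM -ResourceGroupName "myWindows2003ResourceGroup" -Name "myWindows2003VM"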

Update Windows Firewall to Open Port 1352

I had to update the Windows Firewall on the Windows 2003 VM to open port 1352 for the Notes client. I open the Local Area Connection. Then I open Windows Firewall and click on the Exceptions tab. Then click on Add Port.

FirewallExceptions

I add the details for port 1352 as seen below. I do not make any changes to the default scope. I click OK to accept the changes.

FirewallPort

I click OK to close the Windows Firewall dialog.

Update Hosts, LMHosts.sam, and Server Connection

I updated the IP address in the hosts and lmhosts.sam files to point to the local IP address. You may need to do the same if you use these files for host resolution.
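If you prefer to script the hosts file change, a minimal PowerShell sketch is below. The two IP addresses are placeholders for the old entry and the new local address; adjust them to your environment.

$hostsFile = "C:\WINDOWS\system32\drivers\etc\hosts"
# Replace the old server address with the new local address (both values are placeholders)
(Get-Content $hostsFile) -replace "192\.168\.1\.88", "10.0.0.4" | Set-Content $hostsFile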

I also updated the connection record in the local address book in the Lotus Notes client to use the new local IP address.

ServerConnection

Start the Domino Server

I start up the Domino 8.5.3 Server successfully.

ServerScreen

Test Connection via Web Browser

I came back a day later to test connecting to the Domino web server on Azure. I start the VM and the Domino server.

I have to run “load http” on my Domino server to start the web services. You may not have to do so.

Then I open a web browser on my computer and open up the URL http://52.165.134.152/test.nsf

The IP address 52.165.134.152 is the new public IP address. It will change every time I start the Azure VM.

Test.nsf is a Lotus Notes database on my Domino server. The test database opens in a web browser. Success!

WebTestConnection

Test Connection via Lotus Notes Client

Next I will test connecting from my Lotus Notes client. I update the connection document so that connections to the Domino server point to the public IP address:

ServerConnection2

I updated the IP address in the hosts and lmhosts.sam files to point to the public IP address.

Then I attempt to open a Notes database on the Domino server. Success!

OpenApplication

I select my Test.nsf database and open it. Success again!

TestDatabase

Next Steps

To make this a permanent solution, I need to configure a static IP address for the public IP resource in the resource group and a static private IP address for the Windows 2003 Server in the virtual network/subnet. Then I won't need to update the IP address references everywhere each time I restart the Azure VM. A sketch for the public IP change follows.
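For the public IP address, switching the allocation method to Static should be enough. A minimal sketch using the resource names from earlier in this post:

# Switch the public IP from dynamic to static so the address survives restarts
$pip = Get-AzureRmPublicIpAddress -Name "myWindows2003IP" -ResourceGroupName "myWindows2003ResourceGroup"
$pip.PublicIpAllocationMethod = "Static"
Set-AzureRmPublicIpAddress -PublicIpAddress $pip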

I should update the Domino Server record to point to the updated IP address or DNS name.

I should also validate the licensing of the Windows 2003 server. It is warning me that I need to validate the licensing again since the hardware changed significantly.

I could try a lift-and-shift migration using Azure Site Recovery. I think that would be a good approach for a large scale migration of VMs.

Conclusion

Thus, I was able to successfully migrate a Windows 2003 VMWare VM to an Azure VM. I was also able to connect to the Domino server running on the VM via a web browser and the Notes client.

Additional Resources

The link below provides documentation on how to create and upload a Windows virtual hard disk (VHD) to be used in creating an Azure VM. You can upload a VHD from either a generalized VM or a specialized VM.

https://docs.microsoft.com/en-us/azure/virtual-machines/virtual-machines-windows-upload-image

For a complete walk-through of how to prepare, upload and create a new VM using managed disks, see Create a new VM from a generalized VHD uploaded to Azure using Managed Disks or Upload a specialized VHD to create a VM in Azure.

For more details about disks and VHDs in Azure, see About disks and VHDs for virtual machines.

 


Preparing a Windows 2003 Server VMWare VM for Import to Azure VM (Part 1 of 2)

Introduction

I have read a few blog posts on uploading old VMs to Azure. Some of the blog posts make it look like a very simple process. However, the comments on those posts often describe a failure and request additional details. I'm going to post the details about my experience. I will also explain where I ran into problems and how I resolved them.

This is part one of a two-part blog posting. Part one ends with the conversion of a Windows 2003 VMWare VM to a Hyper-V VHD disk. Part two provides the details to upload the VHD disk to Azure and create a VM there. Then I run the Domino server on the VM and test connecting to it with a web browser and a Notes client.

Note: I provide an abundance of details with screenshots.

My Current VMWare VM

I have an old Windows 2003 Server running in a VMWare Workstation VM. I used to run a SharePoint 2010 Server on it, but now I use it to run a Domino R8.5 Server. I suspect that a lot of organizations have Domino servers running on Windows 2003 Servers. The preparation steps would be slightly different if they upgraded their Windows 2003 Server to Windows 2008 or 2012; the differences would be in the configuration of remote access on the server.

I want to upload the Windows 2003 Server to an Azure VM. It has a 40GB disk drive with 7GB of free disk space. It is running a 32-bit version of Windows 2003 Server with Service Pack 2 installed. I know that Azure provides best effort support only for running Windows 2003 servers – both 32-bit and 64-bit.

Below you can see a screenshot of the System Properties of my Windows 2003 Server.

SystemProperties

I know that I can upload both generalized and specialized VHDs to Azure. Each type requires that you prepare the VM before starting. I want to create a specialized VM.

Generalized VHD – a generalized VHD has had all of your personal account information removed using Sysprep. If you intend to use the VHD as an image to create new VMs from, generalize it with Sysprep before uploading.

Specialized VHD – a specialized VHD maintains the user accounts, applications, and other state data from your original VM. If you intend to use the VHD as-is to create a new VM, ensure the following steps are completed:

  • Prepare a Windows VHD to upload to Azure.
  • Do not generalize the VM using Sysprep.
  • Remove any guest virtualization tools and agents that are installed on the VM (e.g., VMware Tools).
  • Ensure the VM is configured to pull its IP address and DNS settings via DHCP. This ensures that the server obtains an IP address within the VNet when it starts up.

Create a Clone

I make sure that my Windows 2003 Server VM is shut down. Then I create a clone of my Windows 2003 Server VM.

The Welcome Screen appears. I click Next to continue.

CloneWelcome

The Clone Source screen appears. I accept the current settings and click Next to continue.

CloneSource

The Clone Type screen appears. I select Create a full clone and click Next to continue.

CloneType

The Clone Name screen appears. I update the name and click Finish to continue.

CloneName

The cloning process begins.

CloningProcess

The final screen appears and I click Close.

CloneClose

Preparing the VM

I power on the cloned VM.

ClonePowerOn

I often get this error displayed, but it never seems to be a problem. I click OK to continue.

ServiceError

I check the VMWare VM Network Configuration.

IP1IP2IP3

I changed the setting to obtain an IP address automatically. This is the right configuration, but it won't really matter since I will lose the network adapter when I convert to a Hyper-V disk.

I clicked OK.

IP4

Windows Management Core Package

I want to check that I have the Windows Management Core package installed on my Windows 2003 Server.

I run PowerShell. I happen to have it on my desktop.

Powershell

I double-click on the icon and PowerShell starts. It looks like I have v1.0 installed.

Powershell2

The package I want to download contains PowerShell 2.0.

I can download the package from here: http://www.microsoft.com/en-us/download/details.aspx?id=4045

I click on Download.

Windows2003update

I have to click on click here to download manually.

downloadUpdate

But that still does not download the file.

I know that the URL to the file is https://download.microsoft.com/download/1/1/7/117FB25C-BB2D-41E1-B01E-0FEB0BC72C30/WindowsServer2003-KB968930-x86-ENG.exe

I return to my PowerShell window.

I copy and paste the following commands:

$url = "http://download.microsoft.com/download/1/1/7/117FB25C-BB2D-41E1-B01E-0FEB0BC72C30/WindowsServer2003-KB968930-x86-ENG.exe"

$path = "C:\download\WindowsServer2003-KB968930-x86-ENG.exe"

# param([string]$url, [string]$path)

if(!(Split-Path -parent $path) -or !(Test-Path -pathType Container (Split-Path -parent $path))) {
    $path = Join-Path $pwd (Split-Path -leaf $path)
}

"Downloading [$url]`nSaving at [$path]"

$client = new-object System.Net.WebClient

$client.DownloadFile($url, $path)

#$client.DownloadData($url, $path)

$path

Note that I removed the “s” in https://.

I confirmed that I have a download folder on the C: drive.

Powershell3

I hit Enter. The file downloads.

Powershell4

I run the file in C:\download.

WMFCoreSetup

Good news! I already have the Core installed!

Run Windows Update

I also checked Windows Update in Internet Explorer to see if I was missing an update.

WindowsUpdate

I have not updated this server in almost 10 years. It is taking a long time to check!

I expect this to be the situation for most companies, too.

WindowsUpdate1

But I know that a process is running. I checked Windows Task Manager.

TaskManager

The process completes almost an hour later.

WindowsUpdate2

I will install the first updates.

WindowsUpdate3

I click on Review and install updates.

ReviewAndInstallUpdates

Then I click on Install Updates.

InstallingUpdates

The installation completes and I click on Restart Now.

InstallationComplete

The Windows 2003 Server restarts.

I check Windows Update again. This time I will install all high-priority updates. There are 169 updates to apply. I expect this to take an hour or more.

SelectPriorityUpdates

Note: Update 17 required me to install Internet Explorer 8. I had to click on some dialog boxes.

I restart the VM after the installation completes.

Set Windows configurations for Azure

On the virtual machine you plan to upload to Azure, run all the following commands from a command prompt window with administrative privileges. In this case, I run the command prompt as Administrator.

RunAs

CommandPrompt

Change to the C:\windows\system32 directory.

Remove any static persistent routes from the routing table:

To view the route table, run route print from the command prompt window.

Check the Persistent Routes section. If there is a persistent route, use route delete to remove it. My VM has none because I changed the IP settings to obtain an IP address automatically.

RoutePrint

Remove the WinHTTP proxy:

netsh winhttp reset proxy

netsh

Set the disk SAN policy to Onlineall.

diskpart

 san policy=onlineall

 exit

diskpart

Set Coordinated Universal Time (UTC) time for Windows, and set the startup type of the Windows Time (w32time) service to Automatic:

REG ADD HKLM\SYSTEM\CurrentControlSet\Control\TimeZoneInformation /v RealTimeIsUniversal /t REG_DWORD /d 1

Enter Yes and press Enter if prompted to overwrite the existing value.

sc config w32time start= auto

scconfig

Set services startup to Windows default values

Make sure that each of the following Windows services is set to the Windows default values. To reset the startup settings, run the following commands:

sc config bfe start= auto

sc config dcomlaunch start= auto

sc config dhcp start= auto

sc config dnscache start= auto

sc config IKEEXT start= auto

sc config iphlpsvc start= auto

sc config PolicyAgent start= demand

sc config LSM start= auto

sc config netlogon start= demand

sc config netman start= demand

sc config NcaSvc start= demand

sc config netprofm start= demand

sc config NlaSvc start= auto

sc config nsi start= auto

sc config RpcSs start= auto

sc config RpcEptMapper start= auto

sc config termService start= demand

sc config MpsSvc start= auto

sc config WinHttpAutoProxySvc start= demand

sc config LanmanWorkstation start= auto

sc config RemoteRegistry start= auto

I saw multiple messages stating that the specified service does not exist, but I continued with the process. Several of these services were introduced in later versions of Windows and do not exist on Windows 2003, which likely explains the messages.

scconfig1

scconfig2

Update Remote Desktop registry settings

If there are any self-signed certificates tied to the Remote Desktop Protocol (RDP) listener, remove them:

REG DELETE "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp\SSLCertificateSHA1Hash"

This registry key did not exist on my VM.

RegDelete

For more information about configuring certificates for the RDP listener, see Listener Certificate Configurations in Windows Server.

Configure the KeepAlive values for RDP service:

REG ADD "HKLM\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services" /v KeepAliveEnable /t REG_DWORD /d 1 /f
REG ADD "HKLM\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services" /v KeepAliveInterval /t REG_DWORD /d 1 /f
REG ADD "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\Winstations\RDP-Tcp" /v KeepAliveTimeout /t REG_DWORD /d 1 /f

KeepAlive

Configure the authentication mode for the RDP service:

REG ADD "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp" /v UserAuthentication /t REG_DWORD /d 1 /f
REG ADD "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp" /v SecurityLayer /t REG_DWORD /d 1 /f
REG ADD "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp" /v fAllowSecProtocolNegotiation /t REG_DWORD /d 1 /f

authentication.jpg

Enable the RDP service by setting the following registry value:

REG ADD "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v fDenyTSConnections /t REG_DWORD /d 0 /f

EnableRDP

Configure Windows Firewall rules

The steps for later versions of Windows server are very different because of a new and improved version of Windows Firewall. It is relatively easy on a Windows 2003 Server.

First, I confirmed that Windows Firewall was on.

ConfirmFirewall

I made sure that the Remote Desktop setting in Windows Firewall on my Windows 2003 server is set to TCP 3389 with Scope set to “Any”.

Firewall3389

Service3389

Run PowerShell as an administrator.

Run the following command in PowerShell to allow WinRM through the firewall and enable the PowerShell Remoting service.

Enable-PSRemoting -Force

Powershell5

Verify VM is healthy, secure, and accessible with RDP

I do not have a way to confirm that the Windows Management Instrumentation (WMI) repository is consistent. That is, there is no command that I can run on Windows 2003 Server. If the repository is corrupted, see the blog post WMI: Repository Corruption, or Not?

I cannot run the bcdedit commands to set the Boot Configuration Data (BCD), because Windows 2003 boots with NTLDR and boot.ini rather than a BCD store. I am listing the commands below so that you know what they are.

bcdedit /set {bootmgr} integrityservices enable

bcdedit /set {default} device partition=C:

bcdedit /set {default} integrityservices enable

bcdedit /set {default} recoveryenabled Off

bcdedit /set {default} osdevice partition=C:

bcdedit /set {default} bootstatuspolicy IgnoreAllFailures

Remove any extra Transport Driver Interface filters, such as software that analyzes TCP packets.

To make sure the disk is healthy and consistent, run the CHKDSK /f command in the command prompt window. Type “Y” to schedule the check and restart the VM.

chkdsk

windows2003chkdsk

Uninstall any other third-party software and driver related to physical components or any other virtualization technology.

I uninstall the VMWare Tools. I restart the VM as required.

Note: This is likely where I lose my network adapter since it is a VMWare network adapter.

VMWareTools

I click Cancel when the Found New Hardware Wizard appears.

FoundNewHardware

Regardless, a new device was installed. I click Yes to restart my VM again.

SystemSettingsChange

Make sure that a third-party application is not using Port 3389. This port is used for the RDP service in Azure. You can run netstat -anob in the command prompt window to see the ports that are used by the applications.

It looks like TermService (svchost.exe) is using Port 3389. This is the Remote Desktop Service.

netstat

If the Windows VHD that you want to upload is a domain controller, extra preparation steps are required; see the Microsoft documentation for those.

Reboot the VM to make sure that Windows is still healthy and can be reached by using the RDP connection.

I check that my Administrator account has the right to log on to the server via Remote Desktop.

LocalSecuritySettings

I will add my Administrator account.

AllowLogon

I click Add User or Group. I enter Administrator and click on Check Names.

SelectObject

I click OK to continue. The Administrator name appears in the list.

AllowLogon2

I click OK to continue.

LocalSecuritySettings2

I am certain that I have no network connection in the VM now. Later I will use Hyper-V Integration Services to enable network connectivity.

Shut down the VM!

 

Convert the VMWare VMDK to Hyper-V VHD

Microsoft offers a VMWare VM conversion kit: http://www.microsoft.com/en-us/download/details.aspx?id=42497

I tried this kit, but afterwards I was unable to connect to the VM on Azure using RDP. I think that is because I lost the VMWare network adapter. Maybe the conversion kit works, but I need to run the converted VHD file in Hyper-V and add a network adapter. That is what I am doing next, though with a different tool for the conversion from VMWare to VHD.

Convert the VMWare VMDK to Virtual PC VM

I still do not have a network adapter in the VMWare VM. I will add one later.

I download and install StarWind V2V Converter 8.0.167. StarWind V2V Image Converter is a virtual machine disk conversion utility that can convert between many existing virtual machine disk formats.

I convert the VMWare Workstation VM disk to a Virtual PC pre-allocated image. The series of screenshots that follow show the steps that I followed.

StarWind1

StarWind2

StarWind3

StarWind4

StarWind5

I have successfully converted my VMWare VM to an MS Virtual PC VM.

Enable Hyper-V

My current setup does not have Hyper-V configured. However, I do have Windows 10 Pro installed on an Intel(R) Core(TM) i7-3840QM CPU @ 2.80GHz, so I am able to configure Hyper-V.

Hyper-V1

I have to configure my Windows 10 computer to support Hyper-V and Virtual PC. I followed the steps in the following blog posts:

https://www.groovypost.com/howto/create-virtual-machine-windows-10-hyper-v/

https://docs.microsoft.com/en-us/virtualization/hyper-v-on-windows/quick-start/enable-hyper-v

More information on Hyper-V running on Windows 10:

https://www.tenforums.com/tutorials/2087-hyper-v-virtualization-setup-use-windows-10-a.html

Hyper-V2

Hyper-V Manager


I run Hyper-V Manager.

First, I create a virtual switch connected to the External network. See the steps documented in Part Three in the URL above.

Second, I create a new VM.
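For reference, both steps can also be scripted with the Hyper-V PowerShell module. This is only a sketch; the switch name, physical adapter name, and VHD path below are assumptions for illustration.

# Create an external virtual switch bound to the physical adapter (adapter name is an assumption)
New-VMSwitch -Name "ExternalSwitch" -NetAdapterName "Ethernet" -AllowManagementOS $true

# Create a Generation 1 VM that attaches the converted VHD and the new switch (path is an assumption)
New-VM -Name "Windows2003" -MemoryStartupBytes 3032MB -Generation 1 `
   -VHDPath "C:\VMs\Windows2003.vhd" -SwitchName "ExternalSwitch"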

Hyper-VManager1

Hyper-VManager2

Hyper-VManager3

I assign 3032MB of memory (or however much you need to allocate). The amount may change anyway when you select your Azure VM size.

Hyper-VManager4

(Note: a screenshot is missing here.)

For the network configuration, I choose the virtual switch that I created in the first step.

Hyper-VManager6

Hyper-VManager7

New VM appears in Hyper-V Manager

Hyper-VManager8

Start the VM

Windows2003starting

It is running, but needs a network connection.

Windows2003runing

Installing a Network Adapter

I map the VM's CD drive to vmguest-HyperV.iso (the Hyper-V Integration Services disc). If you do not have the file, you will have to find it somewhere on the Internet.
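If you prefer PowerShell over the Hyper-V Manager UI, the ISO can be attached with Set-VMDvdDrive. A one-line sketch; the VM name and ISO path are assumptions:

# Attach the Integration Services ISO to the VM's DVD drive
Set-VMDvdDrive -VMName "Windows2003" -Path "C:\ISOs\vmguest-HyperV.iso"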

Click OK to upgrade the HAL.

HALupgrade

Installation starts

Hyper-Vdetect

Click Yes to restart

Hyper-Vrestart

Installation continues after reboot and logging in

Hyper-Vrinstall

Click Yes to restart

Hyper-Vrestart2

After restart, additional settings are applied.

Windows2003applying

Finally, I have a Local Area Connection again! The Microsoft Hyper-V Network Adapter is installed.

ServerLAN

serverVM

Shut down the VM!

That concludes this blog post. Part two of this two-part blog posting will continue with uploading the VHD to Azure.

Using CSS Sprites to Improve Web Page Loading

This is a little off topic from what I have blogged about lately, but I recently had an opportunity to work with CSS sprites.

This is a copy of the blog post that I originally posted here:

Using CSS Sprites to Improve Web Page Loading

I was developing a new web page for an intranet web application. The web designer recommended that the key blocks of content be wrapped in boxes with distinct border colors and rounded corners. The boxes had to be able to resize dynamically based on the amount of content and the size of the web browser window. The image below shows one of the boxes with rounded corners.

image

The original web page contained separate images for each border and corner image. That resulted in 43 images for the page as it appears above:

  • 20 corner images
  • 20 border images
  • 1 image for the exclamation mark
  • 2 images for the arrows

The download time for the web page was very high during performance load testing, so I made changes to use CSS sprites. First, I replaced the 20 border images with CSS border declarations like the one below.

{background-color:#FFFFFF; border:2px solid #FF9900;}

Then I merged all of the images together into one image using the CSS Sprite Generator site at http://csssprites.com/index.html. Below is a sample image of four corner images merged into one image file.

image

As a result, I went from 43 HTTP requests to only one. That was a huge improvement, but now I had to figure out how to use one image file as the source for all of the remaining images. I added the CSS for the rounded corners of each block. This was the tricky part because the images still had to position themselves dynamically.

.roundedBox {position:relative; padding:17px; margin:10px 0;}
.corner {position:absolute; width:20px; height:20px; background-repeat:no-repeat;}

I then defined which image to use in which corner for each of the “squares”. Below is the CSS for one of the squares. Note that background-position is used to define the starting point of the image that I want to use in each corner.

.topLeft {top:0; left:0; background-position:-0px -294px;}
.topRight {top:0; right:0; background-position:-0px -315px;}
.bottomLeft {bottom:0; left:0; background-position:-0px -252px;}
.bottomRight {bottom:0; right:0; background-position:-0px -273px;}

Next, I defined the square using the CSS. Note the reference to the sprites.gif image. The CSS for each square refers to the same image file. I also define the border size and border color in the first line of the CSS.

#roundedsquare {background-color:#FFFFFF; border:2px solid #FF9900;}
#roundedsquare .corner {background-image:url(./images/sprites.gif);}
#roundedsquare .topLeft {top:-2px; left:-2px;}
#roundedsquare .topRight {top:-2px; right:-2px;}
#roundedsquare .bottomLeft {bottom:-2px; left:-2px;}
#roundedsquare .bottomRight {bottom:-2px; right:-2px;white-space:nowrap;}

I add the div tags to the web page, essentially wrapping the existing HTML with them. The class attributes reference the CSS styles that I created.

<div class="roundedBox" id="roundedsquare">
        <!-- add html here -->

        <div id="cornertd" style="display:block">
               <div class="corner topLeft"></div>
               <div class="corner topRight"></div>
               <div class="corner bottomLeft"></div>
               <div class="corner bottomRight"></div>
        </div>
</div>

The result is that the HTML is surrounded by a square with rounded corners. The corner images line up with the borders, and the images and borders use the same color and background.

image

I found CSS sprites very simple to use after applying them the first time. I did sometimes have to adjust the positioning of the images within the sprite image; that is, I learned to give them more space so they would not overlap when placed on the web page.

Installing and Configuring the Lotus Notes Connector for SharePoint 2013 Server: Part 2 of 2

I am ready to start the Lotus Notes connector on the SharePoint 2013 Server. In this second of two blog entries, I execute the steps to:

  • Start the Lotus Notes Connector service
  • Register Lotus Notes with the service
  • Set up and start the crawler
  • Set up, start, and configure the metadata service
  • Create a Search Center site
  • Refine the Lotus Notes search results

This is a copy of the blog post that I originally posted here:

Installing and Configuring the Lotus Notes Connector for SharePoint 2013 Server: Part 2 of 2

I continue to follow the steps detailed here: http://technet.microsoft.com/en-us/library/jj591606.aspx. However, I depart from the steps and do things a bit differently where I ran into problems and fixed them. I document everything in detail.

RHR (2013-05-05): I had to make a few corrections in this blog posting. I corrected the order below. Registering Lotus Notes with the Server has to be completed before starting the Lotus Notes Connector Service.

Register Lotus Notes with the Server

I open Explorer and click through the folders until I get to: C:\Program Files\Microsoft Office Servers\15.0\Bin\1033

image_thumb[110]

I double-click on the NotesSetup file. The Index Setup Wizard screen appears. I click Next to continue.

image_thumb[112]

The Register Lotus Notes screen appears. I enter the data as seen below. I enter the password for the Notes ID that I used to run the Lotus Notes client software. I do not check the Ignore Lotus Notes security while building an index checkbox. I click Next to continue.

Location of notes.ini file: C:\Lotus\Notes\notes.ini
Location of Lotus Notes install directory: C:\Lotus\Notes

image_thumb[163]

The Specify Lotus Notes Owner Field screen appears. I enter the details as seen below. I click Next to continue.

Lotus Notes server name: 192.168.1.88
Lotus Notes database file name: Mapping.nsf
View name: Mappings
Lotus Notes field name column title: UserID
Windows user name column title: DomainAccount
image_thumb[118]

The Completing the Lotus Notes Index Setup Wizard screen appears. I click Finish to continue.

image_thumb[120]

The Microsoft SharePoint Server Configuration screen appears. The configuration succeeded!

image_thumb[165]

Start the Lotus Notes Connector Service

Open Central Administration and click on Manage services on server

image

Click Start on the Lotus Notes Connector service

image

The Lotus Notes connector settings screen appears. I select Create new application pool and enter Contoso Lotus Notes Crawl App Pool. I keep the security account and click Provision.

image

The Working on it message appears briefly

image

The Services on Server screen appears and the Lotus Notes Connector service has started.

image

 

Set Up and Start the Crawler

I return to Central Administration and click on Manage service applications.

image_thumb[167]

Click on the search service application. I named mine Search Service Application.

image

The Search Service Application: Search Administration screen opens.

image

Click on Content Sources

image_thumb[173]

The Search Service Application: Manage Content Sources screen appears. I have started some work on crawling HTML files; maybe I will cover that in a future blog entry.

image

The Search Service Application: Add Content Source screen appears. I enter Lotus Notes Application as the Name. I select Lotus Notes as the type of content to be crawled.

image

I set the Start Addresses to point to the Notes database that I opened earlier. I could add other Notes databases here, too. I also set the Crawl Settings, Crawl Schedules, and Content Source Priority. I click OK to accept the settings and continue.

image

The new content source appears in the Manage Content Sources screen.

image

Set Up, Start, and Configure the Metadata Service

I created a Metadata service while doing some other work. The steps to create one are listed below.

I return to Central Administration and click on Manage service applications.

image_thumb[185]

Click on New in the top left corner of the screen and then on Managed Metadata Service.

image_thumb[187]

The Create New Managed Metadata Service screen appears. I enter a Name and Database Server.

image_thumb[190]

I scroll down the screen. I select Use existing application pool and keep the remaining settings as is. I click OK to accept the edits and continue.

image_thumb[192]

The Managed Metadata service now appears in the list.

image_thumb[194]
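For reference, the same service application and proxy can be created from the SharePoint 2013 Management Shell. This is only a sketch; the application pool, database name, and proxy name below are assumptions for illustration.

# Create the Managed Metadata service application and a proxy in the default proxy group
$mms = New-SPMetadataServiceApplication -Name "Managed Metadata Service" `
   -ApplicationPool "SharePoint Web Services Default" -DatabaseName "Managed_Metadata_DB"
New-SPMetadataServiceApplicationProxy -Name "Managed Metadata Service Proxy" `
   -ServiceApplication $mms -DefaultProxyGroup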

I return to Central Administration and click on Manage services on server.

image_thumb[197]

Click on Start on the Managed Metadata Web Service.

image_thumb[199]

The service starts shortly afterwards. I return to Central Administration and click on Manage service applications.

image_thumb[200]

I click on Managed Metadata in the list of service applications.

image_thumb[202]

I had a problem one time with an error message appearing.

The Managed Metadata Service or Connection is currently not available. The Application Pool or Managed Metadata Web Service may not have been started. Please Contact your Administrator.

I checked that the Contoso App Pool application pool was started in IIS.

image_thumb[204]

I waited about 20 minutes and the message stopped appearing. I considered doing an IIS reset, but it was not necessary. The Site Settings: Term Store Management Tool screen appeared.

image_thumb[206]

I return to Central Administration and click on Manage service applications.

image_thumb[207]

I click on my Search Service Application service application.

image

I then click on Content Sources on the left.

image_thumb[211]

The Search Service Application: Manage Content Sources screen appears. I click on Lotus Notes Application and then on Start Full Crawl.

image

I click OK on the message screen.

image_thumb[215]

The final status update appears as below after the crawling completes.

image
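The same full crawl can also be started from the SharePoint 2013 Management Shell instead of Central Administration. A minimal sketch using the names I chose for the search service application and content source:

# Load the SharePoint snap-in if running from a plain PowerShell window
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$ssa = Get-SPEnterpriseSearchServiceApplication -Identity "Search Service Application"
$cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Lotus Notes Application"
$cs.StartFullCrawl()
$cs.CrawlStatus   # shows the current crawl state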

Create a Search Center Site in SharePoint Server 2013

I created a new Search Center site by executing the steps outlined here: http://technet.microsoft.com/en-us/library/hh582314.aspx

Refining the Lotus Notes Search Results

I open my Search Center site at http://server2012sp/sites/SearchCenter. I enter the word sharepoint in the search window and click on the search icon. The search results appear as below. I blanked out the last part of the last names of the authors. Also, you can see that the Notes document UNID is displayed as the document title.

image_thumb[221]

I found a blog entry (http://blogs.msdn.com/b/opal/archive/2010/02/16/crawl-lotus-domino-with-lotus-notes-connector-in-sharepoint-server-2010.aspx) that described how to replace the document UNID with the document Subject field. The steps were written for SharePoint 2010, but I'm going to show the steps for SharePoint 2013. I suspect that they are virtually the same anyway.

I return to Central Administration and click on Manage service applications.

image_thumb[222]

I click on my Search Service Application service application. I then click on Search Schema.

image_thumb[223]

The Search Service Application: Managed Properties screen appears. I click on Categories.

image

The Search Service Application: Categories screen appears. I click on Notes.

image

I want to use the Subject field for the document title. Based on the design of the form and a review of the data, the Subject field appears to be the best match: it is the field displayed at the top of the Lotus Notes form. The steps that follow show how I map the Subject field to the title.

image_thumb[243]

The Search Service Application: Crawled Properties – Notes screen appears. I enter Subject in the Crawled properties field and click on the green arrow box image_thumb[237]. The list of property names is filtered to include only those that contain the term Subject.

image

I click on OriginalSubject and then click on Edit/Map Property.

image

The Search Service Application: Edit Crawled Property: Subject screen appears. I can add mappings to the managed property on this screen.

image

I click on Add a Mapping. The Managed property selection screen appears. I scroll down the Select a managed property list and select Title(text).

image

I click OK at the bottom of the screen (scrolling down a bit). Title(text) appears in the mapping list.

image_thumb[247]

I click OK to close the Search Service Application: Edit Crawled Property: Subject screen. I am returned to the Search Service Application: Crawled Properties – Notes screen. I enter Subject in the Crawled properties field and click on the green arrow box image_thumb[249]. The list of property names is filtered to include only those that contain the term Subject. I can see that Title is listed in the Mapped To Property column for Subject.

image

I click on Title. The Search Service Application: Edit Managed Property – Title screen appears.

image

I scroll down to the Mappings to crawled properties section and select Subject in the list.

image

I click on Move Up to move Subject to the top of the list. By moving Subject to the top, I am making sure that it is the first to be picked up and mapped to the Title. Otherwise, the document UNID will be displayed. I click OK to accept the changes and close the screen.

image

I click on Content Sources on the left side.

image_thumb[259]

The Search Service Application: Manage Content Sources screen appears. I click on Lotus Notes Forum and then click on Start Full Crawl.

image_thumb[261]

I click on OK on the confirmation prompt.

image_thumb[263]

The crawling starts.

image

The crawling completed after almost 14 minutes.

image

I open my Search Center site at http://server2012sp/sites/SearchCenter. I enter the word sharepoint in the search window and click on the search icon. One of the search results appears as below. You can see that the Notes document Subject is displayed as the document title.

image

I have to add a Server Name Mapping for my Domino server. My Domino server uses port 8080 instead of port 80. I click on Server Name Mappings in the Search Service Application screen.

image

The Search Service Application: Server Name Mappings screen appears. I click on New Mapping.

image

The Search Service Application: Add Server Name Mappings screen appears. I add the name mapping as it appears below. I click OK to continue.

image

The mapping is displayed on the screen.

image

I run another full crawl on the Lotus Notes Application content source. I refresh the Search Results screen and the correct port number appears in the URL.

image

I click on the search result and the web page opens as expected.

image

I have an idea about building a search-driven solution using the data stored in Lotus Notes databases. I saw some interesting solutions by a few SharePoint MVPs.

For now, this concludes my blog entry on installing and configuring the Lotus Notes Connector for SharePoint 2013. I hope that you found it helpful. You can read my other blog entries on how I installed and configured my SharePoint 2013 server environment.

 

Installing and Configuring the Lotus Notes Connector for SharePoint 2013 Server: Part 1 of 2

Installing and Configuring the Lotus Notes Connector for SharePoint 2013 Server: Part 2 of 2

Installing and Configuring the Lotus Notes Connector for SharePoint 2013 Server: Part 1 of 2

I am involved with Lotus Notes application migrations to SharePoint from time to time. I am looking for a solution that can keep some legacy Lotus Notes applications in place but provide SharePoint 2013 users with search capabilities over the legacy data. This could be a very practical and cost-effective business solution for customers.

This is a copy of the blog post that I originally posted here:

Installing and Configuring the Lotus Notes Connector for SharePoint 2013 Server- Part 1 of 2

I want to crawl a Notes database with SharePoint 2013 Search. As part of that effort, I need to install the Lotus Notes client and connector on the SharePoint 2013 Server. But there are a number of steps that I need to execute first. I follow the steps detailed here: http://technet.microsoft.com/en-us/library/jj591606.aspx. However, I depart from the steps and do things a bit differently where I ran into problems and fixed them. I document everything in detail.

In this first of two blogs, I execute the steps to:

  • install and configure the Lotus Notes client software
  • grant access privileges to the Notes \ data folder
  • install the Notes C++ API
  • create the Mappings database
  • add a user account mapping to the Mappings database

I am logged onto my SharePoint 2013 Server with the SP_Install user account. The SP_Install user account is a member of the Administrators group on this server.

Installing the Lotus Notes Client Software

I am installing the 32-bit version of Lotus Notes R8.5 on my SharePoint 2013 server. I will only install the client and not the Designer or Administrator software. I would typically recommend against ever installing such software on a server, but I don't have access to a third-party connector (e.g., BA-Insight). I have installed the Lotus Notes client software more than a hundred times already, so I will only touch on the key points.

I double-click on the installation file.

image

I click Yes on the User Account Control window.

image

I change the file path in the InstallShield Wizard and click Next

image

The files are extracted and the installation process begins. The Install Wizard screen appears. I click Next to continue.

image

I accept the terms in the License Agreement screen and click Next to continue.

image

I enter generic text in the Customer Information screen and click Next to continue.

image

I set the paths in the Installation Path Selection screen as shown below and click Next to continue. I actually tried the default file paths first, and everything worked until the last step of the Lotus Notes Index Setup Wizard, which I could not complete. Simplifying the file paths made it work.

Program Files: C:\Lotus\Notes\
Data Files: C:\Lotus\Notes\Data\

image

I modify the settings in the Custom Setup screen as seen below. I do not select the Domino Designer or the Sametime Client to be installed since I already have them installed on another virtual machine. I really do not want them installed on a server anyway. I click Next to continue.

image

I deselect all of the options on the Ready to Install the Program screen and click Install to continue.

image

The Install Wizard continues to install until the Install Wizard Completed screen appears. I click Finish.

image

Grant Permissions on the Data Folder

I open Explorer to the Notes data folder.

image

Right-click on the data folder and click on Properties.

image

Click on the Security tab on the Properties window.

image

Click on the Edit button.

image

The Permissions for Data screen appears. Click on Add…

image

The Select Users screen appears. Change the location to the local server and enter WSS_WPG into the object names field and click Check Names.

image

The group name should resolve correctly. Click OK to accept and continue.

image

The WSS_WPG group should appear in the security list. Click the Allow checkbox on Full control under permissions. Click OK to accept the changes and continue.

image

The WSS_WPG group appears in the Security list in the Data Properties screen. Full Control permissions are granted. Click Close to close the screen.

image
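The same permission can also be granted from PowerShell instead of the Properties dialog. A minimal sketch, assuming the data path chosen during installation:

$dataPath = "C:\Lotus\Notes\Data"
$acl = Get-Acl $dataPath
# Grant WSS_WPG Full Control, inherited by child folders and files
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule("WSS_WPG", "FullControl", "ContainerInherit,ObjectInherit", "None", "Allow")
$acl.AddAccessRule($rule)
Set-Acl -Path $dataPath -AclObject $acl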

Install Lotus C++ API

I downloaded the Lotus C++ API from here: https://www14.software.ibm.com/webapp/iwm/web/reg/download.do?source=ESD-SUPPORT&S_PKG=CR3DNNA&S_TACT=104CBW71&lang=en_US&cp=UTF-8

I chose the Release 3.0 for Windows file.

image

I executed the c55svna.exe file on the SharePoint 2013 server where the Lotus Notes client will be installed. I clicked on Run in the Open File – Security Warning screen.

image

The Lotus C++ API screen appeared. I clicked Finish to continue.

image

I clicked Yes on the Create directory screen.

image

I clicked OK on the Extraction completed screen.

image

Copy the lcppn30.dll file from the C:\notescpp\lib\mswin32 folder to the C:\Program Files\Microsoft Office Servers\15.0\Bin folder.

RHR (2012-05-05): I fixed a typo below. I referenced the wrong Lotus Notes folder to copy to.

Copy the lcppn30.dll file from the C:\notescpp\lib\mswin32 folder to the C:\Lotus\Notes folder.

image

Update the Path Environment Variable

I am not convinced that I need to update the Path environment variable, but it has solved problems for me in the past when using the Lotus C++ API from IBM.

I open Explorer and right-click on Computer. I click on the Properties option on the menu.

image

The System Properties screen appears. I click on the Advanced tab.

image

RHR (2012-05-05): The Control Panel \ System and Security \ System screen may appear. If it does, click on Advanced system settings next.

image

Click on Environment Variables and scroll down in the System Variables and select Path.

image

Click on Edit and the Edit System Variable screen appears. Append the following text to the end of the Variable value field. Then click OK to accept the change and continue.

;C:\Lotus\Notes;C:\notescpp\lib\mswin32

image

Click OK on the Environment Variables screen.

Click OK on the System Properties screen.
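The same change can be made from PowerShell, which avoids clicking through the dialogs. A minimal sketch (run as administrator):

# Append the Notes and C++ API folders to the machine-level Path
$current = [Environment]::GetEnvironmentVariable("Path", "Machine")
[Environment]::SetEnvironmentVariable("Path", $current + ";C:\Lotus\Notes;C:\notescpp\lib\mswin32", "Machine")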

Configure the Lotus Notes Client Application

I confirm that my Domino server is running on another virtual machine. The server name is Litwaredemo/litwareinc  and the IP address is 192.168.1.88. The Lotus Notes ID that I plan to use is a member of the applicable administrators groups on the Domino server.

I click the Windows Start button on my keyboard and then click on the Lotus Notes 8.5 icon on the workspace.

image

The Client Configuration screen appears and I click Next to continue.

image

The Client Configuration screen appears and I enter the details as seen below. I click Next to continue.

image

The Client Configuration screen opens. Click Next to continue.

image

The Domino Server Network Information screen appears. I enter the settings as seen below and click Next to continue.

image

The Notes ID File screen appears. I browse to where my Notes ID is stored and click Next to continue.

image

I click Yes in the IBM Lotus Notes screen that appears. I do want my Notes ID file copied to the data directory.

image

I enter the password in the Lotus Notes password screen and click Log In to continue.

Note: Never create this Notes ID without a password! Anyone with access to the SharePoint server could then gain full access to the databases on the Domino server.

image

The Lotus Notes client successfully connects to the Domino server.

image

The Client Configuration screen appears. I leave the settings as is and click Next to continue.

image

Another prompt appears. I check the checkbox in the bottom of the prompt and click No to continue.

image

The Lotus Notes 8.5 Getting Started workspace appears.

image

I keep the Lotus Notes client running in preparation for the next steps.

Verify Access to the Lotus Domino Database that You Want to Crawl

I click on File \ Open \ Lotus Notes Application on the menu bar.

image

The Open Application screen appears. I select my Domino Server (LITWAREDEMO/litwareinc) in the drop-down list and click through the folder structure until I see the database that I am looking for. I select it and click Open.

image

I wait for the Lotus Notes database to open. A Create Cross Certificate prompt appears and I click Yes to accept it.

image

The About Database document appears in the workspace. I click the small x on the tab to close it.

image

The database opens in the workspace. It is opened to the default view named Threaded.

image

I also see a message in the view. The message indicates that the view is still being updated.

image

Eventually, the view completes updating and displays data.

image

I can test access to more Notes databases if I want to. I close the Lotus Notes client software after I complete testing by clicking on the X in the top right-hand corner.

image

I click Yes to exit from Notes when prompted.

image

Create the Lotus Notes Mappings Database

I created the Mappings database on another virtual machine where I have the Lotus Notes Designer client installed. I opened the Lotus Notes Designer client and created a Notes database and named it “Mappings”. I then created a new form named “Mapping”. I added two fields and some labels as seen below.

image

I did change the Window Title of the form to “Mapping”.

image

I created a view named “Mappings”. The View Selection formula is: SELECT Form="Mapping". I added a column for each field and sorted the first column in ascending order. The view design appears as below:

image

I removed the default view that came with the design. Only the Mappings view appears in the view design list now.

image

Add User Accounts to the Mappings Database

I open the Mappings database in the Lotus Notes client and click on Create \ Mapping.

image

A blank Notes document appears as below:

image

I enter the names as below to create a new mapping.

image

Lotus Notes User ID: LitwareInc\System Administrator
Windows User: contoso\Admin

I click on the save icon image and then close the document by clicking on File \ Close.

image

The new document appears in the Mappings view.

image

I close the Notes client on my virtual machine. I copy the Mappings database file from my Notes\Data folder to a shared folder. I then copy the file from the shared folder to the Lotus\Domino\Data folder on the Domino server, not to the Notes client on the SharePoint server!
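The two copy hops could be scripted as well. This is only a sketch; the share path, the database file name (Mappings.nsf), and the exact data folder locations are assumptions for illustration, so substitute your own values.

# Copy the Mappings database from the designer VM to a shared folder,
# then from the shared folder to the Domino server's data folder.
# Sketch only; all paths and the file name are illustrative assumptions.
Copy-Item 'C:\Lotus\Notes\Data\Mappings.nsf' '\\fileserver\share\Mappings.nsf'

# Run this second copy on the Domino server:
Copy-Item '\\fileserver\share\Mappings.nsf' 'C:\Lotus\Domino\Data\Mappings.nsf'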

Restart the Server

I have the Search components installed on my SharePoint 2013 server. I have to restart the server before proceeding with the next steps.

This concludes part 1 of 2 of my blog entries on installing and configuring the Lotus Notes Connector for SharePoint 2013. I hope that you found it helpful. Part 2 will provide the steps to complete everything to the point where you can get search results from your Notes databases. You can read my other blog entries on how I installed and configured my SharePoint 2013 server environment.

Installing and Configuring the Lotus Notes Connector for SharePoint 2013 Server: Part 1 of 2

Installing and Configuring the Lotus Notes Connector for SharePoint 2013 Server: Part 2 of 2

Migrating Extracted Content to SharePoint 2010 using Kapow Design Studio

 

In a previous series of posts, I described the steps I took to migrate Lotus Notes content from a Sandbox database to the local Kapow database. At this point, I could use Kapow robots to transform the local content in place. However, I am going to skip to the next step and migrate the content to a SharePoint 2010 list.

I am still running Kapow Design Studio.

SharePoint List

I created a simple list named “SandBox” in SharePoint 2010. I do not really need to copy content for all of the columns; but I am doing it anyway just as an example.

clip_image001

I modified the All Items view to show content more relevant to what I migrated.

clip_image003

Creating the migration robot

Select the “Lotus Notes Sandbox Migration” project.

clip_image004

Create a new robot

clip_image005

Enter “NotesExtraction.robot” as the robot name and click [Next] to continue.

clip_image006

You can leave the URL blank for now. Click [Finish] to continue.

clip_image007

Below is a screenshot of the robot we want to create.

clip_image009

Load Page Step

I create the first step by right-clicking on the “x” in the robot view window.

clip_image010

Click on [Insert Step Before] in the drop-down list.

clip_image011

The new step appears as “Unnamed”.

clip_image012

Click on [Select an Action] and then on the [Load Page] option in the Action tab in the Step view.

clip_image013

Enter the URL that you want to start loading content from. In this example, I start from a SharePoint list view.

http://win-q085cbbhm02/sites/TestTeamSite/Lists/SandBox/AllItems.aspx

Click on [Configure] beside “Options”. The Options dialog box opens in the workspace. I have a Username and Password entered because I set my SharePoint web application to not allow anonymous access. Click [OK] to close the Options dialog box after setting the credentials.

clip_image014

Click on the “x” in the robot view window after you enter your URL.

clip_image010[1]

The contents of the web page at the URL should load in the Windows view for you to see.

clip_image016

The HTML in the web page is displayed in the HTML view. Note that I have the “Ignore Styles” checkbox checked.

clip_image018

You can select HTML tags in either window and see the selection in both windows.

Query the Database

I need to query the database where all of the content was extracted to. Right-click on the “x” in the robot view window and click on “Insert Step Before”.

The new step should appear. Update the step on the Action tab as seen below: Database is set to the objectdb and a SQL statement is added. Note that the Post variable is added to Variables, and the variables are then mapped in the Variables Map section.

Step

Tag Finders

Action

Query Database

clip_image020

clip_image022

Next, I need to get to the current tag. I created the table below to show how I got to the correct starting tag. Note how I name a tag in the “Name” field in the Action of step 1. Then I reference the tag name in the “In this Tag” field in the Tag Finders of step 2. This process repeats itself in each step. Only the first step refers to “Anywhere in Page” for the “Find Where” value for Tag Finders. All of the remaining steps refer to “In Current Tag”. That is how I drilled down through the HTML to get to the correct starting tag.

Note also that the first tag starts as seen in the image below. The image shows some content that I added in testing the migration robot. You should have the “Ignore Styles” checkbox checked to see the HTML without styles.

clip_image024

 

Step

Tag Finders

Action

1

clip_image026

clip_image028

2

clip_image030

clip_image032

3

clip_image034

clip_image036

The final start tag step should place the starting point as seen in the image below. This basically places us close to where the [Add New Item] tag is.

clip_image038

Scroll down in the HTML window and look for the “Add new item” text. Right-click on it and select “Click” on the drop-down menu.

clip_image039

The new Click step should appear as below. Note how the current tags that were defined earlier continue to be used.

Step

Tag Finders

Action

Click

clip_image041

clip_image043

I had to enter the credentials again by clicking on [Configure] beside Options. I clicked OK when I was done.

clip_image044

Right-click on the “x” in the robot view window.

clip_image010[2]

This will load the “New Item” screen. Scroll down the HTML window named “Unnamed (2)” or something like that; it should be the newest window tab displayed. Look for the HTML fields as seen in the image below.

clip_image046

I created a series of steps to enter the text values into the HTML fields. Basically, I right-click in each field and select the “Enter Text” option to create each step. I select the applicable item from the Post variable for “text to enter”.

clip_image047

As you create each step, you should see the text inserted into the applicable field.

clip_image048

I created the following steps.

Step

Tag Finders

Action

Enter Title

clip_image050

clip_image052

Enter Unique Id

clip_image054

clip_image056

Enter File Name

clip_image058

clip_image060

Enter File Size Type

clip_image062

clip_image064

Enter File Size

clip_image066

clip_image068

Enter Platform

clip_image070

clip_image072

Enter Release

clip_image073

clip_image075

Enter Product

clip_image077

clip_image079

Enter Category

clip_image081

clip_image083

Enter Submitted By

clip_image085

clip_image087

Enter Description

clip_image089

clip_image091

Enter URL

clip_image093

clip_image095

Enter Url File Attachment

clip_image097

clip_image099

Grouping Steps

I grouped the steps by first selecting them with the mouse in the robot editor view.

clip_image101

Then click the [Group] button in the action bar. Enter the name “Enter Item Values” for the group and hit [Enter] on your keyboard.

clip_image102

You can now collapse and expand the group in the robot editor view by clicking on the [-+] button in the group.

clip_image103

Selecting File Attachment

Scroll down the HTML window until you see the section for adding a file attachment. Right-click on the Browse button and field so that both are selected. Click on the “Select File” option in the drop-down menu.

clip_image105

The new Select File step should appear as below. Note that you need to set file selection options to find the correct file. The original extraction robot extracted file attachments to the local file system in the C:\Temp directory. The filename string value was stored in the Post object.

Step

Tag Finders

Action

Select File

clip_image107

clip_image109

I remember that some file attachments were missing from the extraction process. The error handling for this step should skip files that do not exist in C:\Temp.

clip_image110
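Before running the migration robot, it is worth confirming which extracted attachments actually exist on disk. Here is a rough PowerShell sketch; the file names in the list are made up for illustration, since in practice you would pull them from wherever your extraction robot recorded them.

# Check which expected attachment files are actually present in C:\Temp.
# Sketch only; the $expected list is an illustrative assumption.
$expected = 'wdpick.nsf', 'example2.zip'
foreach ($file in $expected) {
    if (-not (Test-Path (Join-Path 'C:\Temp' $file))) {
        Write-Warning "Missing attachment: $file"
    }
}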

Updated Screen

I should now see values in almost every field in the web page. At least, values for those fields that had corresponding values in the local database.

clip_image112

Right-click on the [OK] button and select the “Click” option. The new Click OK step should appear as below.

Step

Tag Finders

Action

Click

clip_image114

clip_image116

The error handling for this step should be the same as for the previous step.

clip_image110[1]

Right-click on the [Save] button and select the “Click” option. The new Click Save step should appear as below.

Step

Tag Finders

Action

Click

clip_image118

clip_image119

The error handling for this step should be the same as for the previous two steps.

clip_image110[2]

I had to enter the credentials again by clicking on [Configure] beside Options. I clicked OK when I was done.

clip_image044[1]

Click on the “x” in the robot view window to end processing.

clip_image010[3]

Save the robot.

Open a web browser to the SharePoint list and see if you added a new entry with the robot.

clip_image121
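You could also spot-check the list from the SharePoint 2010 Management Shell on the server instead of refreshing the browser each time. This is only a sketch; the site URL and list name are the ones used in this post.

# List the five most recently created items in the SandBox list.
# Sketch only; run in the SharePoint 2010 Management Shell on the server.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$web  = Get-SPWeb 'http://win-q085cbbhm02/sites/TestTeamSite'
$list = $web.Lists['SandBox']
$list.Items |
    Sort-Object { $_['Created'] } -Descending |
    Select-Object -First 5 |
    ForEach-Object { '{0}  {1}' -f $_['Created'], $_.Title }
$web.Dispose()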

Click on the list item to see all of the details. All of the details from the original Notes database document are now in the SharePoint list. The URLs still point to the Notes database. Normally, I would not have copied those over; but I did so here just as an example. I could also have run a transformation robot to update the URLs to point elsewhere.

clip_image123

I can open the file attachment by clicking on “wdpick.nsf” at the bottom of the page. Note that I did not recreate the confirmation page that existed in the Notes application. Instead, I attached the file attachment to the main document it belongs to. That was my choice to keep things simple here.

Also, I did not try to implement any document-level security that may have existed in the Lotus Notes application. I could also improve the user experience by creating a custom form for the list. I could also save the file attachments in a separate folder and link to them.

Conclusion

Many thanks to the Kapow support team for helping me through this process. I’m sure that they know better ways for some of the above steps. I did learn a lot using this hands-on approach. This effort was a great way to see how this content migration tool works.

Creating a Migration Robot in Kapow Design Studio – Part 5 of 5

Four more steps to add!

clip_image001

The hyperlink needs to be clicked to get to the “Download Agreement” web page. You may need to right-click it and then click on the [Click] option; this creates a step that you can later delete.

clip_image002

You should see the “Download Agreement” web page.

clip_image004

Scroll down the web page until you see the “I have read and accept the terms and conditions” hyperlink.

clip_image005

Right-click on the hyperlink and select the [Extract \ Extract URL \ Post.URL] option.

clip_image007

A new step is created and it should appear like below.

Step

Tag Finders

Action

Extract File URL

clip_image009

clip_image011

Right-click on the “Extract File URL” step and select [Insert Step After].

clip_image012

A new step is created. Change the name of the step on the Basic tab to “Assign Variable” and it should appear like below.

Step

Tag Finders

Action

Assign Variable

clip_image014

clip_image016

The text in the Value text field appears as follows:

substring(Post.URLFileAttachment, indexOf(Post.URLFileAttachment, "$FILE/")+6, length(Post.URLFileAttachment)-12)
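If I read the expression correctly, it pulls the attachment filename out of the extracted URL: it starts six characters past the “$FILE/” marker and stops twelve characters short of the end, which strips a trailing “?OpenElement”. Here is a rough PowerShell equivalent, using a made-up Notes attachment URL purely for illustration.

# Rough PowerShell equivalent of the Kapow expression above.
# The URL is an illustrative assumption, not one from my Domino server.
$url   = 'http://litwaredemo/sandbox.nsf/0/ABC123/$FILE/wdpick.nsf?OpenElement'
$start = $url.IndexOf('$FILE/') + 6        # first character of the filename
$end   = $url.Length - 12                  # drop the trailing "?OpenElement"
$fileName = $url.Substring($start, $end - $start)
$fileName                                  # -> wdpick.nsf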

Right-click on the “Assign Variable” step and select [Insert Step After].

clip_image012[1]

A new step is created. Change the name of the step on the Basic tab to “Extract File Attachment” and it should appear like below.

Step

Tag Finders

Action

Extract File Attachment

clip_image018

clip_image020

This step is extracting the file attachment and saving it with the same filename to the C:\Temp directory.
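For context, the step is doing roughly what this PowerShell sketch does; the URL and filename are the same made-up illustration values as above.

# Roughly what the Extract File Attachment step does: download the file at the
# extracted URL and save it under the extracted filename in C:\Temp.
# Sketch only; the URL and filename are illustrative assumptions.
$url      = 'http://litwaredemo/sandbox.nsf/0/ABC123/$FILE/wdpick.nsf?OpenElement'
$fileName = 'wdpick.nsf'
$client = New-Object System.Net.WebClient
$client.DownloadFile($url, (Join-Path 'C:\Temp' $fileName))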

Right-click on the “Extract File Attachment” step and select [Insert Step After].

clip_image012[2]

A new step is created. It should appear like below.

Step

Tag Finders

Action

Store in Database

clip_image014[1]

clip_image022

The “Store in Database” step saves the record to the database for future use.

Grouping Steps

I grouped the steps by first selecting them with the mouse in the robot editor view.

clip_image024

Then click the [Group] button in the action bar. Enter a name for the group (e.g. Extract File Attachment) and hit [Enter] on your keyboard.

clip_image025

You can now collapse and expand the group in the robot editor view by clicking on the [-+] button in the group.

clip_image026

Save the Robot

Save the robot by clicking on File \ Save in the menu.

clip_image027

Next: I have to document and publish the steps to migrate the content that I just extracted to a SharePoint 2010 server. I have done the migration already; but I need to record it here.

Many thanks to the Kapow support team for helping me through this process. I’m sure that they know better ways for some of the steps. I did learn a lot using this hands-on approach.