Think about Cloud – the correct way

Recently a friend asked me a question – what will moving to the cloud give me, and will it save me money?

Maybe, maybe not – it depends, largely on how you weigh your money against your time.

Moving to the cloud with saving money as your first priority is the wrong way to think about it. The cloud gives you reliability, capacity, high availability, performance, flexibility, mobility, a seamless experience, the newest versions of hardware and software… The cloud is your chance to make your company work better and more efficiently. For example, what about cross-continent high availability? What about five-nines availability? What about new hardware that is released every year? Can you afford the downtime caused by a faulty software update? Think how much effort you would need to accomplish these complex things on your own – and they are there, in the cloud, ready to be used and deployed, just a few mouse clicks away.

Apart from that, moving to the cloud means you do not have to think about hardware or software maintenance, patch management (countless hours of pretesting included), new versions, updates, upgrades and so on. You always get the newest hardware and the newest software, and almost all of that work is already done for you, particularly if you choose SaaS or PaaS. That means your IT team does not waste valuable time on problem solving. Instead, they can focus on innovation – on the things that will move your company forward in a much shorter timeframe.

Now that you are thinking about the cloud the correct way, we can say a few words about saving money too. There are many cases where the cloud costs less than on-premises. For example, when you need a SharePoint or SAP development environment for your developers – why pay for licenses? Why pay for the hardware, or for someone to construct that complex environment for you? Instead, you can have it all in minutes – in the cloud (Azure is a good example here).
Another good case is a new company – building your own datacenter, even a small one, costs a lot of money. The cloud is often far cheaper in such cases.

If you already have existing infrastructure, the cloud may be a great place for your backup datacenter – particularly since modern cloud providers charge only for the resources you actually use (for example, in Azure you do not pay a cent for a turned-off VM, only for its storage).

The topic is huge and we could talk about it at length, but I hope this short post has touched on enough to emphasize some of the key elements of the cloud.

So think about cloud, but please, the correct way :)

Running Azure Stack inside a Virtual Machine in a home lab

Microsoft Azure Stack is a great product that will go GA at the end of the year, but until then we have the first Technical Preview, which can be downloaded here.

Considering it is actually a whole cloud platform, Azure Stack's minimum hardware requirements are quite hefty – 12 physical cores and 96 GB of RAM. You won't find that much compute power in most PCs, but there are options, particularly for people who have access to a relatively powerful workstation-class PC – I ran it myself with 20 GB of RAM.

To start, you need a Windows Server 2016 TP4 installation with the virtualization features enabled. You can install it directly on your computer, boot directly from the .vhdx file included with the Microsoft Azure Stack POC, or use virtualization software with nested virtualization support (as I did).

For Hyper-V, nested virtualization is available starting with Windows 10 Insider Preview build 10565 – you can read the instructions for enabling it here.
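On a Hyper-V host that supports nested virtualization, enabling it boils down to a few cmdlets. This is a sketch – the VM name "AzureStackHost" is my placeholder for your Windows Server 2016 TP4 guest, and the VM must be turned off first:

```powershell
# Run on the Hyper-V host while the guest VM is turned off.
# "AzureStackHost" is a placeholder name for your WS2016 TP4 VM.
Set-VMProcessor -VMName "AzureStackHost" -ExposeVirtualizationExtensions $true
# Dynamic Memory is not compatible with nested virtualization, so disable it.
Set-VMMemory -VMName "AzureStackHost" -DynamicMemoryEnabled $false
# MAC address spoofing lets the nested VMs reach the network.
Set-VMNetworkAdapter -VMName "AzureStackHost" -MacAddressSpoofing On
```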

Otherwise, you can use other software, such as VMware Workstation. For nested virtualization to work there, select "Hyper-V (unsupported)" as the guest OS in the VM settings (there is also an option involving editing the VM's .vmx file, but let's keep it simple).

Please keep in mind that the method described below is not supported by Microsoft, but when you need to learn with only limited resources at hand, improvising is more than acceptable.

With the first part finished, we now need to customize the Azure Stack installation files to shrink the official hardware requirements. After downloading and extracting the installation files, mount the "MicrosoftAzureStackPOC.vhdx" file to get write access to its contents.
Go to the \AzureStackInstaller\PoCDeployment directory and open the "Invoke-AzureStackDeploymentPrecheck.ps1" file.
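Mounting can be done from Disk Management or, assuming the Hyper-V PowerShell module is installed, with a one-liner (the path below is a placeholder for wherever you extracted the files):

```powershell
# Adjust the path to wherever you extracted the POC files.
Mount-VHD -Path "C:\AzureStack\MicrosoftAzureStackPOC.vhdx"
```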
Now find this part of code:

function CheckRam {
    Write-Verbose "Check RAM."
   
    $mem = Get-WmiObject -Class Win32_ComputerSystem
    $totalMemoryInGB = [Math]::Round($mem.TotalPhysicalMemory / (1024 * 1024 * 1024))
    if ($totalMemoryInGB -lt 64) {
        throw "Check system memory requirement failed. At least 64GB physical memory is required."
    }
}

Change the "64" in the line "if ($totalMemoryInGB -lt 64)" to whatever amount of memory you can give Azure Stack. In my case I went with 20 GB. Save and close the file.
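With 20 GB, for example, the edited check would look like this (the number is whatever your machine actually has; the throw message can stay as-is, since it is only ever shown when the check fails):

```powershell
    if ($totalMemoryInGB -lt 20) {
        throw "Check system memory requirement failed. At least 64GB physical memory is required."
    }
```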

Next we need to reduce the memory requirements for the infrastructure VMs. Go to the \AzureStackInstaller\PoCFabricInstaller directory and open the "PoCFabricSettings.xml" file for editing.

Find this line of code:

<Name>ADVM</Name>

Below it are the configuration settings for the "ADVM" virtual machine. Modify them to match this:

      <ProcessorCount>4</ProcessorCount>
      <RAM>1</RAM>
      <MinRAM>1</MinRAM>
      <MaxRAM>2</MaxRAM>

If you like, you can also modify the "ProcessorCount" value. For me the CPU count was not critical, so I left it unmodified.
These settings mean that the "ADVM" virtual machine will have a startup memory of 1 GB and the Dynamic Memory feature turned on, configured with a minimum of 1 GB and a maximum of 2 GB of RAM.

Now find every other VM configuration in the file and modify it the same way. Here is the full list of VM names:

ACSVM
ADVM
BGPVM
ClientVM
MuxVM
NATVM
NCVM
PortalVM
SQLVM
xRPVM
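Editing ten VM entries by hand is tedious. As a rough sketch – assuming each VM's RAM values sit as direct siblings of its <Name> element, as in the ADVM snippet above – you could patch them all at once from PowerShell:

```powershell
# Sketch only: assumes <RAM>/<MinRAM>/<MaxRAM> are siblings of <Name>,
# as in the ADVM snippet above. Adjust the path to your mount point.
$path = "F:\AzureStackInstaller\PoCFabricInstaller\PoCFabricSettings.xml"
[xml]$settings = Get-Content $path
$vmNames = "ACSVM","ADVM","BGPVM","ClientVM","MuxVM",
           "NATVM","NCVM","PortalVM","SQLVM","xRPVM"
foreach ($name in $vmNames) {
    # Select the element whose <Name> child matches the VM name.
    $vm = $settings.SelectSingleNode("//*[Name='$name']")
    if ($vm -ne $null) {
        $vm.RAM    = "1"
        $vm.MinRAM = "1"
        $vm.MaxRAM = "2"
    }
}
$settings.Save($path)
```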

After making the modifications, save and close the "PoCFabricSettings.xml" file and unmount the "MicrosoftAzureStackPOC.vhdx" file.
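The unmount, too, can be done from PowerShell if you prefer:

```powershell
# Use the same path you passed to Mount-VHD.
Dismount-VHD -Path "C:\AzureStack\MicrosoftAzureStackPOC.vhdx"
```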

That's all. Now you can run "DeployAzureStack.ps1" and install Azure Stack as described in the official guide.

Limitations of Exchange ActiveSync and the need for improvement

Exchange ActiveSync is the industry standard when it comes to using Exchange email on mobile devices. But it has its shortcomings.

Recently we had a business need for several people to use the shared calendar functionality of Exchange. Our colleagues needed a shared calendar integrated into their smartphone's calendar alongside their own. It works flawlessly in OWA and in Outlook, but when it comes to ActiveSync, things get pretty hard.

None of today's ActiveSync clients support shared calendar functionality – not even Windows Phone or the Windows 10 Technical Preview Mail app. The problem is actually in Exchange ActiveSync itself, which does not support that delegated functionality, by design.

If you need someone's shared calendar on your smartphone, there are two workarounds:
1) Add a second ActiveSync account – but then you need to know that person's password.
2) Use a third-party app – but handing your password to a third party raises security concerns.
Neither of the above is a good solution, I think – and I hope you agree.

The functionality is in huge demand everywhere, though. There are assistants who need access to their boss's calendar, there are shared corporate calendars, and so on. And while today's smartphones are such feature-rich devices, limitations like this are a huge letdown and hinder the otherwise great Exchange experience in the mobile world.

There is not much time left until Exchange 2016 (supposedly that will be the name) comes to life, so this is the very moment for Microsoft to fix the issue. Either ActiveSync needs to be improved and expanded to support the shared and delegated functionality of Exchange Server, or we should get an Outlook Anywhere client in the mobile world too, with fully flavored Exchange functionality.

On the other hand, I totally understand the need for compliance and the role ActiveSync plays there. Microsoft is pushing ActiveSync heavily by including it in Outlook 2013 and the Windows Mail app. EAS has been honed over the years and offers pretty neat policy enforcement capabilities and the functionality to secure access to a mailbox from any device, anytime.
If it is not possible for Microsoft to deliver the same on mobile devices with an Outlook Anywhere client, then improving ActiveSync's functionality is the way to go.

Even if radical changes are needed, that should not stop Microsoft, as the Exchange experience should be full-featured across the board. Competitors like Apple and Google will quickly integrate the new functionality into iOS and Android, since Exchange has a huge user base and no one wants to lose customers.

whoami /all or First post and some about me

With more than 7 years of experience in professional IT, I currently work at the Ministry of Foreign Affairs of Georgia as Head of the Systems and Network Maintenance Division, System Administrator, and Information Security Officer. Quite a lot for one man, you might think – but that's how it is.

I am heavily focused on Microsoft technologies, including but not limited to Windows Server, Exchange, SharePoint, Lync, SQL, System Center, IIS, TMG and others.
Aside from that, I speak two virtualization languages: Microsoft Hyper-V and VMware vSphere.

Certified since 2011, I hold certifications such as MCSA, MCP, MCITP and MCTS, and I am currently working my way toward completing my MCSE certification paths.

I founded the MCP-Way blog a year ago but was never able to find enough time to run it – until today. The blog will mainly focus on interesting news, guides and things worth sharing from my everyday work life and experience. I hope I will always find the time to share something interesting with you.

You can find me at LinkedIn, Facebook and Twitter.

Of course, it all began somewhere:

My relationship with IT goes way back to 2001, when we bought our family's first PC and I fell in love with it. It was based on a Willamette Pentium 4, equipped with 128 MB of SDRAM and a 20 GB HDD – a pretty nice machine for the time.
I was an 11-year-old kid then, so I had plenty of time to learn.

In 2003 I joined the CompInfo team – the first e-magazine in Georgia. My years at CompInfo were, I think, the most productive of my IT life, igniting an "IT fire" inside me. That is also when I got my internet nickname – Power_VANO – which I have kept and use to this day. You can find me everywhere under that name.

In 2007 the Overclockers.ge project emerged, and I joined it in 2008. Today I am a co-founder of the modern Overclockers.ge community. That was arguably the beginning of my official IT career.

After that came a year at advertising laboratory 919 – one of the best of its kind in Georgia. A great team with great projects and ideas, and my role there was web programmer and general IT specialist.

Then I moved to EMIS, and then to my current home – the Ministry of Foreign Affairs of Georgia – as an IT Support Specialist. After a year I moved into the System Administrator position – and what came after that, you already know.