Monday, September 19, 2022

HP Spectre Convertible Laptop - restarts instead of shutting down

Had an old laptop to repurpose.  Reinstalled Windows, drivers, etc.  It seemed to work fine, except that when I would shut down the computer, a minute later it would be sitting there on again.  Argh.

Looking online, this seems to be a common issue for the HP Spectre x360 convertible models.  Found solutions involving downgrading drivers (which does appear to work), etc.

Then luckily I found this answer by DJElectron: SOLVED!!! Re: HP Spectre X360 15 2017 model does not shut do... - Page 2 - HP Support Community - 5978451

So, resetting the CMOS by doing the following fixed the issue.

  1. Shutdown the system by holding the power button
  2. Press and hold Win+V
  3. Power On System
  4. Once the light on the power button comes on, wait 10 seconds, then release Win+V
  5. Screen should say CMOS checksum invalid after a few moments. 
  6. Press Enter to reboot
Highly frustrating issue with an easy fix :)

Thursday, September 15, 2022

SQL to PowerApps without Premium Connector


We have an application (our primary one) whose licensing structure is based on active device connections.  Problem: we have more devices than licenses.  This isn't a big deal since, due to scheduling, not all devices are connecting at once.

But, when user 1 on device 1 goes home and forgets to close the app, then user 2 on device 2 has an issue (provided the other licenses are already taken).  Closing out user 1 is fairly easy (reboot the computer), but not when there are 10+ workstations and they don't know which one is using the license.  It suddenly became a whack-a-mole game that they didn't want to play.


The license usage information is stored in a SQL table.  So if we can make this information accessible they can see exactly where licenses are used. 

Note: there are other ways to handle this, but they each had their own cons (ie auto-logout of users on a timer, etc).  For now, simply reporting on it was the best option to make the whack-a-mole game quicker.

With PowerApps we could access the SQL table directly and tell the users!  But wait, MS charges (IMO) a large amount of money for these types of connections.  Can we do it without a premium connector?  Yes.  It's not as great, but it will do :)


The idea is to take this in multiple parts to end up with an end product that we put on the main SharePoint page so they can quickly see what workstations are up for whacking.

  1. Export the data from SQL
  2. Upload it to a SharePoint List
  3. Display it in a pretty format

1. Export from SQL 

First up, we need to get the data out of SQL into a format we can use.  We also want to do this easily; I'm not interested in messing around all day figuring out why I can't get the export to Excel to work, or other variations.  PowerShell has a SQL module and there's Export-Csv, so that's easy.

We'll need the SQL PowerShell module: Download SQL Server PowerShell Module - SQL Server | Microsoft Docs.  In my case I already had it, as I added it when I built the server and installed SQL.  There's a well-written blog post on this with a number of options.  Option 3 is what we are looking for: How To Export SQL Server Data From Table To a CSV File

Invoke-Sqlcmd -query 'Select WSID, ConnDate FROM mydb.myschema.connections;' -ServerInstance S-SQLDEV | Export-Csv -Path D:\Reports\connections.csv -NoTypeInformation

Great, now I have a PowerShell script that I can schedule with Task Scheduler to run every so often, let's say every 10 minutes, and dump a csv file for me.  It's pretty lightweight, so I'm not worried about the hit my SQL server will take from it.
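Since this will run unattended, it may be worth wrapping that one-liner with basic error logging so a broken export doesn't fail silently — a sketch, reusing the server, query, and paths from above (the log path is my own addition):

```powershell
# Sketch: same export as above, with try/catch logging added (log path is hypothetical)
try {
    Invoke-Sqlcmd -Query 'SELECT WSID, ConnDate FROM mydb.myschema.connections;' `
        -ServerInstance 'S-SQLDEV' -ErrorAction Stop |
        Export-Csv -Path 'D:\Reports\connections.csv' -NoTypeInformation
}
catch {
    # Append a timestamped failure line instead of dying silently
    "$(Get-Date -Format s) export failed: $_" |
        Add-Content -Path 'D:\Reports\export-errors.log'
}
```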

Task Scheduler: 

Under Program/Script enter Powershell.exe

For the "Add arguments" enter -ExecutionPolicy Bypass -File "D:\Scripts\MyScript.ps1" (of course using your path and script name)

And the account that runs the Scheduled task needs read permissions to the SQL Database in question (or you will get blank results).
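If you'd rather script the task creation than click through Task Scheduler, something like this should work — a sketch using the ScheduledTasks module, with a hypothetical task name and service account:

```powershell
# Hypothetical task name and service account - substitute your own
$action  = New-ScheduledTaskAction -Execute 'Powershell.exe' `
    -Argument '-ExecutionPolicy Bypass -File "D:\Scripts\MyScript.ps1"'

# Repeat every 10 minutes (a repetition duration is required on older OS versions)
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Minutes 10) `
    -RepetitionDuration (New-TimeSpan -Days 3650)

Register-ScheduledTask -TaskName 'Export SQL Connections' -Action $action -Trigger $trigger `
    -User 'DOMAIN\svc-sqlexport' -Password 'ServiceAccountPassword'
```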

2. Upload to SharePoint List

Okay, now we need to get our lovely csv to a SharePoint list.  This is fairly easy, but there are some steps involved.  In particular, we need PowerShell v7 and PnP PowerShell Module (at least I went this route).  Also, I did this part on a different server as I didn't want to monkey around with uploading files to SharePoint on my SQL server.  Instead, I used another PS script to move the file from the SQL server to my file server and then uploaded from there (optional step below).

Optional: as just stated, I didn't want this part on my SQL Server, so I copied the file to a File Server first. I did this by adding the below to my PS script on the SQL Server.  (the service account used to run the scheduled task will need permissions at the remote location)

Copy-Item -Path "Microsoft.PowerShell.Core\FileSystem::D:\Reports\connections.csv" -Destination "Microsoft.PowerShell.Core\FileSystem::\\FileServerName\D$\Scripts\Connections\connections.csv"

I then proceeded with the rest of the steps on the File Server.

Installing PowerShell v7 is pretty easy... Installing PowerShell on Windows - PowerShell | Microsoft Docs

Optional: PSv7 doesn't include the ISE anymore.  

Now they encourage you to use Visual Studio Code with the PowerShell Extension.  Download Visual Studio Code - Mac, Linux, Windows

So I proceeded to download VSCode on my workstation so I could build the PSv7 script and run it on the File Server.  (I installed PSv7 on both my workstation and the file server)

Once installed either install the PowerShell extension during the setup process or go to Settings (bottom left), Extensions and find / install.

SharePoint PnP PowerShell Module:

I decided to use the PnP PowerShell module as I felt that it greatly reduced the complexity of the scripts.

Here's a great writeup of SharePoint PnP by June with connection instructions: How to Use SharePoint PNP PowerShell Module in Office 365

Note: you'll want to run Install-Module "PnP.PowerShell" when logged in as the account that task scheduler will be set to run as or the script will fail.

I had issues using his directions for non-interactive connections.  So, I used the following instead, which I created a separate post for: Did You Restart?: PowerShell PnP connection using Azure AD App Registration and Certificates

Import CSV to SharePoint List:

Now we can import our csv to a SharePoint List to use as our "free database".  Salaudeen nails it with his post on SharePoint Diary.  SharePoint Online: Import CSV File into SharePoint List using PowerShell - SharePoint Diary

SharePoint List: 

First let's set up a List on our SharePoint site.  I'm not going to detail these steps, as you should be fairly familiar with this already.  I will point out, however, that SharePoint forces creation of a "Title" column with the required flag on.  Rather than renaming this column, I just went into the list settings, selected the column, and turned off the required flag.  Then I leave the Title column blank and ignore it.

I also created the necessary columns in the SharePoint list to match the columns in the CSV.  So in my case WSID and ConnDate.
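As an aside, the list and the two columns can also be created with PnP PowerShell instead of through the UI — a sketch using the column names from this post (assumes you're already connected with Connect-PnPOnline):

```powershell
# Create the list and matching columns (names taken from the CSV above)
New-PnPList -Title "Connections" -Template GenericList

Add-PnPField -List "Connections" -DisplayName "WSID" -InternalName "WSID" `
    -Type Text -AddToDefaultView
Add-PnPField -List "Connections" -DisplayName "ConnDate" -InternalName "ConnDate" `
    -Type DateTime -AddToDefaultView

# Turn off the required flag on the built-in Title column so it can be ignored
Set-PnPField -List "Connections" -Identity "Title" -Values @{Required=$false}
```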


I used the second option from the SharePoint Diary with PnP and some slight modifications.

  1. We need to initiate the non-interactive connection to SharePoint online
  2. I wanted to replace the list each time, not add or update.

Script is copied from the SharePoint Diary site linked above.  I only added the connection and the line to get/delete all list contents.


$SiteUrl = ""

$ListName = "Connections"

$CSVPath = "C:\Scripts\Connections\Connections.csv"

#Connect to SharePoint Online non-interactively

Connect-PnPOnline $SiteUrl -ClientId 'yourclientIDfromPowerShellStep' -Tenant '' -Thumbprint 'CertificateThumbprintfromPowerShellStep'

#Get the CSV file contents

$CSVData = Import-Csv -Path $CSVPath

#Get all contents of the list and delete it!  No add or update

Get-PnPListItem -List $ListName | Remove-PnPListItem -Force

#Iterate through each row in the CSV and import the data to the SharePoint Online list

ForEach ($Row in $CSVData)
{
    Write-Host "Adding Contact $($Row.WSID)"

    #Add list items - map with the internal names of the fields!

    Add-PnPListItem -List $ListName -Values @{"WSID" = $($Row.WSID);
                                              "ConnDate" = $($Row.ConnDate)}
}

Awesome script.  Great job Salaudeen.

Now all I had to do was save the script and make a scheduled task that runs as the user that I set up the PowerShell PnP automation certificate under. Note: you must also run Install-Module "PnP.PowerShell" under that account.

Now, the csv is uploaded to the List and all contents overwritten on the schedule I set (really, we're deleting all content and then uploading the csv...)

Since the data is now in a SharePoint List we can access it with PowerApps / PowerAutomate without paying for the premium connector.  No, it's not real-time, but that hardly matters for some data.

PowerShell PnP connection using Azure AD App Registration and Certificates

I had a project I was working on where I wanted to automate uploading of a CSV file to SharePoint List.  Of course with all the security changes and MFA I needed to find a way to do it securely. 

That's when I found the following by June: How to Use SharePoint PNP PowerShell Module in Office 365

Using his directions for the Non-Interactive didn't work for me... but it got me on the right track.  Know that his directions may work fine and I just didn't do it right :)

I used the following to make this work:

  • PowerShell v7
  • Visual Studio Code (as replacement for ISE)
  • Windows Server 2016 and 2019, also replicated on Windows 10 and 11.

The following steps will be covered:
  1. Create and import SSL Cert
  2. Register App in Azure AD
  3. Set app permissions
  4. Set app certificate
  5. Connection string for script
Hopefully this will help me when I need to do it again in the future or anyone else that happens to read these notes!

Note: don't forget to run Install-Module "PnP.PowerShell"  

Create the Self-Signed Certificate:

Create the self-signed certificate.  Other options can be used; these are the basics.

New-SelfSignedCertificate -Subject "PowerShell PnP" -CertStoreLocation Cert:\CurrentUser\My

This is going to generate a certificate thumbprint.  Copy it into the next part.

Export the certificate as a CER and PFX

Export-Certificate -Cert Cert:\CurrentUser\My\PasteThumbprintHere -Type Cert -FilePath PowerShellPnPM365App.cer

$password = ConvertTo-SecureString -String "UberSecurePasswordHere" -Force -AsPlainText

Export-PfxCertificate -Password $password -Cert Cert:\CurrentUser\My\PasteThumbprintHere -FilePath PowerShellPnPHost.pfx

You now have a CER and PFX file.  The CER will be uploaded to Azure AD.  The PFX will be installed on the computer doing the automated scripting under the personal certificate store of the user account used for the automation.

Go ahead and login as the account that will be running the scripts.  Then install the PFX certificate with the password you chose.

Copy the thumbprint as you'll use that in your script.
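The PFX install can be scripted too, if you prefer — a sketch assuming a hypothetical file path and the example password from the export step; run it while logged in as the automation account:

```powershell
# Hypothetical path; the password must match what was used at export time
$pfxPassword = ConvertTo-SecureString -String "UberSecurePasswordHere" -Force -AsPlainText
Import-PfxCertificate -FilePath "C:\Certs\PowerShellPnPHost.pfx" `
    -CertStoreLocation Cert:\CurrentUser\My -Password $pfxPassword

# List the thumbprint to copy into the connection string later
Get-ChildItem Cert:\CurrentUser\My |
    Where-Object Subject -like "*PowerShell PnP*" |
    Select-Object Subject, Thumbprint
```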

Setup Azure AD:

Jump over to your Azure AD admin center and navigate to "Azure Active Directory" and then "App Registrations" then "All applications"

Click New Registration and give it a name.  No Redirect URI is needed.

This is going to give you a screen showing your new Application (client) ID and the Directory (tenant) ID.  Copy these both down as you'll need them later.

Click on Certificates and Secrets, and then click Certificates

Click upload certificate and select the CER that you created earlier.  Notice the thumbprint should match what you had earlier.

Now you can give your app permissions to the area you need.  In my case I chose to use API Permissions.  Click API Permissions, and then Add a Permission.  Choose the area you want to add, in my case it was for a SharePoint list so I picked SharePoint.

I then wanted it to be application permissions as this was for automation.

Here you can choose to give access to Full Site control, or you can narrow it down further.  I wanted to be somewhat granular in this case so I chose "Sites.Selected".

We now see Sites.Selected listed, but notice the "Not Granted" status.  I then clicked "Grant admin consent for ...." to grant permission.

The checkmark went green and the permission now showed as granted.

We also need to give permission to the specific site!  If you click on the permission you'll see the following: Allow the application to access a subset of site collections without a signed in user. The specific site collections and the permissions granted will be configured in SharePoint Online.

Let's hop back over to a PowerShell 7 window that has the PnP Module installed.

Connect-PnPOnline -Url "" -PnPManagementShell

You should be given a code to copy and a link to a web browser login page.  Open it and log in with a Global Admin to give consent.

Grant-PnPAzureAdAppSitePermission -AppID "Application(Client)ID Here" -DisplayName 'PowerShell PnP Automation' -Site "" -permissions Write

You can double check using Get-PnPAzureADAppSitePermission


Now we can test that all works for our script!

$SiteUrl = ""

Connect-PnPOnline $SiteUrl -ClientId 'YourClientID' -Tenant '' -Thumbprint 'YourCertificatesThumbprint'

It should connect with no errors if all is happy.
Then we can test by pulling a list of all the SharePoint Lists on that site:

Get-PnPList

You should see your site's Lists.

Happy Automating!

Friday, July 1, 2022

Sage 300 ERP - ODBC error - Invalid Database Specification

We utilize Sage 300 ERP from a Windows RDSH environment.  After upgrading to a 2019 environment and reinstalling Sage, I had manually created the ODBC DSN.  Unfortunately this caused "Invalid Database Specification" errors for end users.
You'll find a lot of info that this is of course due to ODBC connection issues, and I kept finding references saying that the end users didn't have permission to access the SYSTEM DSN.
Fixes include making registry changes, using a User DSN, ensuring you're using ODBCAD32, firewall issues, etc.

None of these worked for me until I found a quick mention on a Sage user forum thread stating to "Run As Administrator" Sage 300 so it could create the ODBC connection itself.

After I cleaned up the changes I had tried and deleted the ODBC DSN I had manually created, I did this and, lo and behold, it created the DSN and it works for end users.  (see very last post)

I despise Sage 300; I think it's poorly programmed from a Systems Administrator point of view.  Maybe it's an "accounting software" thing, as I greatly dislike QuickBooks Desktop / Enterprise as well, and a few other accounting packages I've worked with.
Oh well, it's installed and working now... until I deploy 2022...

Monday, June 6, 2022

Microsoft 365 Tenant Migration with AAD Connect reusing same domain

Recently we had a need to migrate to a new tenant space, largely due to COVID-19 and extreme downsizing and company structure changes.  I can't stress enough that a successful migration is about planning and staging before any of the migration has actually begun.  Additionally, use AAD Connect to your advantage!  Convert as many cloud-only accounts to internally synced accounts as possible, as this can save you a ton of work.

Note: I'm not going over adding the Microsoft 365 PowerShell modules that are needed.

This is not a comprehensive guide, as each environment is going to have its own unique areas, but this worked great for me and can be used as a template for someone else.

The migration had a few requirements.
  1. We had an internal domain connected with AAD Connect - which was not the primary domain in tenant.
  2. There were 8 domains total in the tenant used for various email accounts. 4 domains were moving, 4 were not.  The was moving
  3. A handful of users had a single user account with email addresses across all 8 domains!
I found that planning for this was very complicated, but execution was actually very simple!  Note that this was for under 50 users and I completed it solo.  I'm sure it could be simplified further with more scripts or tools.

Planning / Staging:

First I started with looking at all the different types of accounts and integrations that would be affected.  In particular, sorting through the various service accounts set up for SMTP Auth, since these accounts are either cloud-only or internally synced.  Additionally, some users are cloud-only accounts.

So to start, a handy export was needed from the tenant - Users and Groups into csv.  You can then change this file to xlsx and start adding more columns and a filter.  I recommend adding columns for identifying which accounts are for SMTP Auth, which are AADSync (which is included in the export), which accounts need to have forwards in place to the new tenant, which domains are moving, etc.

Now came pre-staging of the OLD environment.  
Since I had accounts that had emails across all 8 domains, I began to identify them and break them into 2 accounts.  The ones with domains that would be moving I set up as internal AD accounts.  For the ones not moving, I created new cloud-only accounts and set up forwards to their internal account.  This could easily be scripted for large groups.  I used Shared Mailboxes to save on cost.  I used forwarding instead of adding permissions to the shared mailbox so that after migration I wouldn't have to change them again.
I did the same thing for any Shared Mailboxes, Distribution groups, etc.  Get everything for the domains that are moving converted to AAD Sync internal groups if possible!  With this when you reconnect AAD Connect to the new tenant and do the first sync all of your work will be done for you.

I also moved SharePoint and OneDrive.  For SharePoint, our sites were small and not built out, so a simple "Mover" migration was sufficient.  You can find a link to Mover in the SharePoint admin console under Migration.  I didn't actually migrate OneDrive; instead we handled that from the client computer end (ie disconnect, then reconnect to the new tenant and let it upload everything again).

Create new Tenant:
You can create your new tenant at any point, you'll just need to add a valid license to it.  Of course the tenant name will be, make it a good one this time!  Learn from my mistakes, DON'T make it the company name, who'd have figured those could change so often... marketing people... ;)
Create a user account for each user account that will be moving.  This can easily be scripted.  They will all have for their username. 
Note: we licensed ours several days prior to migration.

I also added one AAD Premium P1 license and one Azure Information Protection Premium Plan 1 to the new tenant.  This allowed us to do a lot of configuration to the environment prior to migration.
Ie, SharePoint pre-stage and config, Exchange rules and other config, Spam filter, and the gobs of other settings that I wanted locked down.

Week prior to Migration:
We used BitTitan MigrationWiz to move our mailbox information.  So, we did a pre-stage (everything except the last 30 days) 1 week prior to the go day.
I also informed everyone that everything was going to break on the "go" day.  For simplicity's sake we had each user leave their computer turned on so that we could manually fix their Teams, OneDrive, Outlook, Office apps, and MS Edge sync on the day of migration.  This was doable for us due to our small user count.  I'm sure there are better ways...  More on what I did to fix each app at the bottom.

I also pulled all of the LegacyExchangeDN values just in case I needed them.  Easier now than later...
Get-Mailbox | Select Name, PrimarySMTPAddress, LegacyExchangeDN | Export-Csv 'pathtofile\LegacyExchangeDN.csv' -NoTypeInformation
Get-DistributionGroup | Select Name, PrimarySMTPAddress, LegacyExchangeDN | Export-Csv 'pathtofile\LegacyExchangeDNgroups.csv' -NoTypeInformation
Create a user migration List:
Also, I created a csv file of all the users I had pre-staged in the new tenant.  On the day of migration their UPNs will need to be changed prior to reconnecting AADConnect, and I wanted it done easily.
CSV file needs to have at least 2 columns with the following headers.  Name this userCloud.csv
PrimarySMTPAddress and UPN
PrimarySMTPAddress is the username in the new tenant, ie
UPN is proper primary email address you will want them to have. ie

Day prior to Migration:
On the day prior I ran another Pre-Stage with MigrationWiz to get everything up to that day.  I don't want to be sitting around for hours waiting for the final staging.

Day of Migration:
  1. Changed MX Records to an invalid record for each domain.  This made any mail sent to us get "held" by the sending server for retry instead of giving back an NDR.  I want all that mail to come through once I've moved the domains.
  2. Run the final MigrationWiz pass.  I also removed everyone's access from SharePoint.  Wait for the final pass to finish before proceeding!
  3. Add an empty root OU to AD, this is temporary.
  4. Run the AADConnect configuration, and point it to that empty root OU you just created.  When the sync runs there will be NOTHING to sync, so it will process this as removal of ALL of your AADSync objects.  Just like that it removed everything for you so you can remove your domains.
  5. Remove any objects that were cloud only objects for the moving domains.  
  6. Under Settings - Domains - click on each domain and go through the tabs, you'll see what objects are left on each domain.  Once the domains that are moving are cleared of all objects you can delete each of the domains!
  7. Now you can go to your new Tenant and add each domain. Hint: use a different web browser for each tenant so you're not having to constantly log in and out. For PowerShell use 2 different VMs.
  8. Now we're going to fix the UPN for all of the new Tenant pre staged users.  You're going to use the CSV you created with Powershell
    1. $users = Import-Csv 'path to file\userCloud.csv'
      foreach ($user in $users){Set-MsolUserPrincipalName -UserPrincipalName $user.PrimarySmtpAddress -NewUserPrincipalName $user.upn }
  9. Now all of the users in the new tenant have the proper accounts that MATCH their internal Active Directory UPNs.  This way AADSync will automatically associate them properly.
  10. Go back to AADSync and configure it to point to your proper OU's again.  Let the sync run and bingo, you now see that it associated properly AND all your groups are back and created / populated with group membership.
  11. Fix your MX Records and test!  Don't forget to setup SPF, DKIM, DMARC again as needed.
  12. Note that you need to create "cloud only" distribution groups and add membership back.  If you converted them to Active Directory prior, then they were automatically created by the sync
  13. Test your emails setup out again!  Send and Receive
  14. Now it's time to fix apps

Fix Apps: 

I found this part to be the worst.  Overall it went fine, but it's tedious.
Note this is for Windows 10 only.  Other OS's may be different.
On some machines, signing out in one place caused others to auto sign out.  Some didn't, IDK.
  1. I started with ensuring everything was closed.
  2. Opened Control Panel, switch to small icons, open Mail, Show Profiles, Delete
  3. Opened Excel (or Word), File, Account, Sign Out
  4. Open Settings, Accounts, Access work or school, expand the account, Disconnect.
  5. Also checked under Accounts, Email & Accounts, and removed anything I could. 
  6. Opened Edge browser, Settings, Sign Out of profile, Did not clear their favorites and other info.
  7. Removed links to the old SharePoint as I saw them (and added the new site)
  8. Opened OneDrive and Unlink this PC (note it prompts that it will stop synching and a copy of the files will be left on the PC).
  9. Open Teams and Sign out
  10. Reboot
  11. Open up each and setup new.  Edge, OneDrive, SharePoint, Office, Teams.  Note that GPO's or Azure AD can help do this automatically for you with SSO and device mgmt if you have it.
Finally, notified users to sign out of Teams, SharePoint, Onedrive, etc on their mobile phones.  In Outlook mobile app delete the account (for Onedrive connector too) and add back new.

Clean Up!

At this point you should be back up and running.  Time to just start combing through settings and objects and doing any cleanup that is necessary.  Hopefully if you did this it went as well as mine did.  

Don't forget your scanners and other components that use SMTP Relay or alerting of that sort!
I'll also throw SharePoint in here: I had used Mover to move everything after the fact and then manually added back permissions.  We weren't using a lot of SharePoint at the time so it wasn't a big deal.  Of course, if you're using Power Apps, Power Automate, lots of SharePoint, and other services, then you'll want to spend more time than I did looking at these solutions.  (we do now, and what a nightmare that would be all on its own!)

Hopefully this helps someone.

Wednesday, January 12, 2022

Godaddy - new certificate crt and pem files, but need pfx

My brain mostly "seems" to be able to contain valuable information.  But for some reason this piece of valuable information (needed at least 1 - 4 times a year) is never retained.  Each year I find myself pulling out the google foo to find a solution.

The issue:
I'm either purchasing a new cert, changing an existing cert, renewing a cert, etc.  I go to Godaddy, run through the process, and get the download files.  I'm given a CRT, a PEM, and an intermediate .p7b file.
I need a PFX file.

Google foo always gives me plenty of articles, typically about using OpenSSL.  They usually involve running a command which looks promising, but I KNOW I didn't use OpenSSL last time...
I think my downfall with this is that I usually type in something like "Godaddy convert CRT to PFX".  The missing part is that I actually have a PEM which is what's important to me here.

Since I typically generate the CSR from within one of my IIS instances, all I have to do to get the PFX is go back to IIS, complete the signing request, and, when asked for the new file, give it the PEM.
Now, right click and export to PFX, give it a password and finish my project.
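For what it's worth, that right-click export can also be done in PowerShell — a sketch with a placeholder thumbprint and password, assuming the completed cert landed in the machine store:

```powershell
# Placeholder thumbprint/password - use your cert's actual values
$pfxPassword = ConvertTo-SecureString -String "PfxPasswordHere" -Force -AsPlainText
Export-PfxCertificate -Cert Cert:\LocalMachine\My\PasteThumbprintHere `
    -FilePath C:\Certs\mysite.pfx -Password $pfxPassword
```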

Now next year (or month), when I can't remember this easy process for the 100th time, hopefully my google foo will see my own post OR I'll finally commit this to memory.

/wr mem

Thursday, October 21, 2021

NEC SV9100 inMail and Exchange Online / Microsoft 365

We've utilized NEC SV9100 with inmail for voicemail for 5 years.  We also utilize Microsoft Office 365 / Exchange Online.  

Setting up the voicemail-to-email feature is fairly easy and there are lots of guides online to do so.  For that matter, being in IT, setting up system SMTP for scanning, alerting, etc, etc, etc is like brushing my teeth.  So, looking at the inMail settings for SMTP was enough to make me yawn, grab a cup of coffee to help stave off the boredom, and get to work.

Ten minutes later all done, tested, working... seemingly.  I had put in the port 587, TLS, username, password, blah blah blah.  In fact, I did this almost 3 or 4 years ago.  

Fast forward to yesterday.  Complaint comes in about "I've called and left a VM and no one contacted me". Of course, that triggers the CEO to call, leave a message, and then send out the email "Who got that message?  call me".  Quick looksee anddddd, well, no one got the message WTH.  I call the number, leave a message and seconds later have the message.  Call again, receive message.  Start to suspect the number the CEO called or the classic "What did the user do wrong?".  You know PEBKAC.

At this point I decide PEBKAC is wrong (since it's the CEO) and call into the VM box directly (which btw no one checks because it's an email forward only mailbox) and listen to the messages.  I hear me testing, I hear me testing again, I hear a fax machine crap message, I hear the CEO asking for someone to call him... Definitely not a PEBKAC, but rather an OHCRAP.

After a quick chat with a friend that is an NEC Certified Tech I find that I'm not the first to see this issue.  As soon as the words "inmail Office365 random issue" come out of my computer he stops me and responds with a resounding "Yesssss, we never recommend that".  

Here's the thing, directly inputting an account into SMTP settings on inmail so that it can authenticate and send works and from my experience it almost always works.  BUT when you can't lose an occasional random message from a customer, "almost" isn't good enough.

According to my friend and online searches, the generally accepted method is to use Gmail, a local relay, or Option #2 or Option #3 of this document. (Note: I was using Option #1)

How to set up a multifunction device or application to send email using Microsoft 365 or Office 365 | Microsoft Docs

For Option #2 and Option #3 I see lots of comments online saying they work, but in my mind Option #1 looked like it was working too.

In the end I decided to go the tried and true way that hasn't failed me yet: IIS SMTP Relay.  Alternatively, using an onsite Exchange server, hMailServer, or another reliable method would be acceptable.  Basically, I wanted the mail to have a quick trip locally to an email queue.  With this I can even write a PowerShell script to monitor it if desired.  At the very least, I'm not depending on some online authentication occurring between the NEC and Microsoft, which could fail mid-communication.

If you haven't set up an IIS SMTP Relay before, well, it's pretty easy.  Google how to install it if you don't know.  I'll give the quick config to make it work with the NEC.  I usually do this on my Print Server or another lightly used server.  Note that it does require installation of the IIS role.

  1. Add a secondary IP address to the server (don't do this on a DC). I prefer to run each SMTP Relay on its own dedicated IP.
  2. Create a new home directory (it will be used in a later step).  I usually put this in C:\Inetpub\New Name.  The "new name" I typically make the name of the task that this relay is for.  IE, voicemail or NEC.
  3. Open up Internet Information Services (IIS) 6.0 Manager (of course after you've installed the required roles)
  4. Right click on the server name, New, SMTP Virtual Server

  5. Give it a name.  I like to name them the task followed by - and the last octet of the ip address assigned in step 1.  Example: NEC - .44
  6. Select the IP assigned to the server in Step 1
  7. Select the home directory we created in Step 2
  8. Enter a domain name.  I typically make this the servers FQDN.  DO NOT make it the domain name of the email that these are going to.  For instance, if the account you're emailing this to is then you would not want to enter or the emails will go into the "drop" folder because it's a "local" address.  In my case the FQDN is different than the email domain so I enter FQDN :)  If your emails are going to the Drop folder (more on this in a minute) then check this.
  9. OK and you'll be presented with a new pretty SMTP relay

  10. Right click on the "NEC - .44" virtual server and select Properties
  11. Ensure "Limit number of connections to" is unchecked
  12. On the Access tab, click Relay, select "Only the list below", add the NEC IP address, and I uncheck "Allow all computers which...."
  13. Messages tab.  I change the limit message size and session size to 20480 (ie 20MB).
  14. Delivery tab, I change the expiration timeout to 4 days.
    1. Outbound security.  This will depend somewhat on where it's going, but in my case I require authentication.  This will mostly depend on how you want to set up your SMTP Relay server using that previous link in my post.  As you can see, we're moving Microsoft's Option 1, 2, or 3 to here.  So the SMTP Relay is the one authenticating with Exchange Online instead of the NEC.
      1. So, I change this to Basic Auth, and enter the username and password of my Voicemail account
      2. Check the TLS Encryption option
    2. Outbound Connections, change the TCP Port to 587
    3. Advanced, change the Smart Host to
    4. Hit OK to exit out of the properties.
  15. Restart the Simple Mail Transport Protocol service (not sure if this is required)
  16. Now we test it.
  17. Make a file named "Test email" on the desktop of the server or somewhere.  Remove the file extension so that it's extensionless. 
  18. Open the file with Notepad or Notepad++
  19. Enter the following 4 lines.  Notice there are no spaces
      1. Line 1 (From:): If you're using Option 1 from MS then the email address entered must match EXACTLY the account you're using to send voicemail.
      2. Line 2 (To:): With Options 2 and 3 it must match any email address in your Exchange Online environment (so it can be a dist list), but note that means Step 14 Outbound Security will be different as well. (maybe I'll change mine and update this post at a later date)
      3. Line 3: Subject:Test
      4. Line 4: Test Test (this is the message body)

  20. Save the file
  21. Create a copy of the file
  22. Open up file explorer to C:\inetpub\voicemail\pickup and drag and drop the copy you just made into the folder.
  23. It will instantly disappear.
  24. Go to the C:\inetpub\voicemail\drop and badmail directories to see if it's there (hopefully not).  If not then you probably got the email.
  25. If it's in Queue then something doesn't match up properly and it's gone into retry mode.  This could be that the credentials are wrong, no path out, you didn't setup Office 365 properly, etc.  Basically, it can't deliver to Office 365.  If you wait long enough (4 days) it will eventually move to badmail.
  26. If it's in badmail, then most likely issue is the From email address doesn't match up properly and it was rejected.  
  27. If it's in Drop, then from my experience this typically means I forgot my own advice and made the SMTP virtual server domain the same as my email domain.  To fix this, expand the tree, and in the right window double click the domain and change it.
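For reference, here's what the finished pickup file might look like.  The addresses below are placeholders; swap in the actual From account and destination mailbox from your own setup (per steps 19.1 and 19.2):

```
From:voicemail@example.com
To:admin@example.com
Subject:Test
Test Test
```

Save it with no extension, copy it, and drop the copy into the pickup folder as described in steps 21-23.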

So what was the point of this post?  This is all over the googles if you search for it...  I intend for this to be one more post that shows up on the googles when people like me search for setting up voicemail with Office 365, so that others hopefully don't run into the random missing-voicemail-when-all-appears-to-be-working OHCRAP moment.  My failure is online so hopefully you don't have this failure.

Have a better option?  Post it! 

Wednesday, October 20, 2021

IIS 7 SSL Cert - There was an error while performing this operation

It was that exciting time of year again, SSL Cert renewal time!  

I say exciting, because it never fails that when Cert renewal times comes up I hit my head against some issue (I suspect it's the exact same issue year after year and I just don't remember).

This time, changing the cert in IIS 7, I'm greeted with "There was an error while performing this operation. Details: A specified logon session does not exist. It may already have been terminated. (Exception from HRESULT: 0x80070520)"

It should be noted that when this occurred the site went down!  I was able to select the old cert and hit okay and all was well again.  Select new cert, OK, and error with site down again.

NOTE: I have since found another way to produce this issue with its own fix.  I have modified the below with Fix 1 and Fix 2.  You may have to do BOTH of the below, as I recently discovered.

I found a lot of solutions out there and I'm sure they work, but I didn't see the easy one that worked for me.  I also found some that say the solution is that you have to have "export private key" checked when importing the certificate (note that this IS NOT NEEDED).

FIX 1: I had my certificate imported from a pfx without the option for export private key.  It was stored under Local Computer - Web Hosting (this is true of the old cert and new cert).

In the binding screen I selected the "Localhost" certificate.  Hit OK

I then immediately hit edit again.  Selected the new certificate from the drop down and hit OK.  Click Close, go to your site and verify it's using the new cert.

FIX 2: I had a new certificate that I imported via the IIS Server Certificates option.  No matter what, I would continue to get the error following my directions above.  I found a post online where a commenter mentioned that they had to import from MMC rather than IIS.  I deleted the cert that I had imported via IIS.  Had cmd open, so went to it and typed MMC, then File - Add/Remove Snap-in - Certificates - Computer Account - OK.  Expand Web Hosting - Certificates.  Right click, import my new cert, changing the file type to *.* and selecting the cert.  DO NOT check the box for exportable.
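If you'd rather script the MMC import, certutil can target the Web Hosting store from an elevated prompt.  This is a sketch, not something from the original fix: I'm assuming the store's internal name is "WebHosting" and the password/path are obviously yours:

```powershell
# Import the pfx into Local Computer - Web Hosting; NoExport mirrors
# leaving the "exportable" box unchecked
certutil -f -p "PfxPassword" -importpfx WebHosting "C:\certs\newcert.pfx" NoExport
```

Then follow the FIX 1 steps in IIS as usual.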

Then went back to IIS and followed my FIX 1 steps.  Worked great.

No error, very minimal downtime (when localhost cert is selected). Happy happy

Now, will I remember this next year?  Or remember to check my blog notes?  Probably not.

Thursday, July 16, 2020

Trend Micro Worry Free Business - very slow opening of apps

We recently switched from Webroot to Trend Micro Worry Free (I now believe this was a mistake).  Almost immediately I started getting reports of "computer slowness" and started noticing it myself.  Primarily: OneDrive having issues synchronizing, Chrome and Edge (Chromium) opening very slowly, clicking links in emails (again, opening browsers) slow, a long delay logging into Windows after a reboot, slow loading of additional tabs / webpages, and other areas.

This appears to be a well known issue when using Trend Micro with "Unauthorized Change Prevention Service".  Watching the task manager when doing many of the tasks and I could see this service jump to the top.
Unfortunately, many of the TM options are dependent on this service, but at the end of the day I'm a firm believer that machines need to be speedy, so I disabled the service.  Note: I also disabled the Behavior Monitoring as this is dependent on the service.
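If you want to script the same change, something like the following should work from elevated PowerShell.  The display-name match is an assumption based on how the service appears in services.msc; TM's self-protection may also block this until it's disabled in the console:

```powershell
# Assumption: the service's display name contains "Unauthorized Change Prevention"
$svc = Get-Service -DisplayName '*Unauthorized Change Prevention*'
Stop-Service -Name $svc.Name -Force
Set-Service -Name $svc.Name -StartupType Disabled
```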

If you're reading this while "thinking" of moving to Trend Micro I would advise you to take a test drive first.  I've found several issues which support is working through, but it's been a bumpy road.

  1. Extreme slowdown when scheduled scans run (as opposed to what we're used to seeing with Webroot).
  2. Unauthorized Change Prevention Service slowdown.
  3. Issue with builds prior to 6.7.1319 being unable to restore to domain OUs.
  4. Issue with many of our installs prior to 6.7.1319 being unable to update to latest build automatically - support still looking into issue.

Tuesday, June 2, 2020

Dot net 3.5 install error

I've had lots of issues in the past with being unable to install Dot Net 3.5 on Windows 10.  Typically, I can easily download the Win10 ISO, mount it, and use DISM with the sources switch.  Today I encountered 2 laptops running Windows 10 on which I continued to have issues and errors.

ISO mounted and received "the source files can't be found".  This was with the latest Win10 ISO download.

Checked WSUS and feature on demand is checked.

Easy fix is to bypass WSUS temporarily...
UseWUServer set to 0
Install Dot Net 3.5
Set the reg key back to 1
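Put together, the bypass looks roughly like this from an elevated PowerShell prompt.  The UseWUServer value lives under the standard Windows Update policy key:

```powershell
$au = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU'

# Point the machine at Windows Update instead of WSUS
Set-ItemProperty -Path $au -Name UseWUServer -Value 0
Restart-Service wuauserv

# Install .NET 3.5 (pulls the source files from Windows Update)
DISM /Online /Enable-Feature /FeatureName:NetFx3 /All

# Point it back at WSUS
Set-ItemProperty -Path $au -Name UseWUServer -Value 1
Restart-Service wuauserv
```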

Monday, March 30, 2020

Windows Server 2016 RDSH - Start Menu stops working

On our farm of Windows Server 2016 RDSH (Remote Desktop Session Host) I've had seemingly random issues with the start menu stopping working.  This likely correlates with a Windows update being applied, but it's hard to tell as you do not always know immediately that it's stopped working (users complain days later or never complain and you notice when doing other maintenance, etc).

Searching the internet you find a number of solutions, but the most crazy (in my opinion) solution I found was the one that actually worked! 

In this post user MrManual says to delete and recreate a registry key dealing with the Firewall.  One, like me, would think this crazy and continue on trying all the other solutions only to have the issue remain (or return shortly).

Finally, figuring it's best to try a crazy solution than rebuild the server I open powershell and give it a go:

Remove-Item "HKLM:\SYSTEM\CurrentControlSet\Services\SharedAccess\Parameters\FirewallPolicy\RestrictedServices\Configurable\System"
New-Item "HKLM:\SYSTEM\CurrentControlSet\Services\SharedAccess\Parameters\FirewallPolicy\RestrictedServices\Configurable\System"

Click start menu and GASP it opens!

Note: other ideas on the thread do work, but seemingly only temporarily.  I still suspect this to have something to do with the crappy UPDs.
On the note of UPDs, one might ask "if you hate UPDs so much why not switch to FSLogix?  I mean, it is free after all..."

Saturday, March 21, 2020

Dell Latitude 7480 / 7490 loud fan issue

We have a lot of Dell Latitude 7480 / 7490 laptops deployed.  When I first got them in we had lots of issues and complaints about the loud fan speed.  Under load this is understandable, but many times this would be with no load.  This is a common issue early on for these models, as one can see from the numerous posts online.

In the past when I would get one of these laptops it was a matter of ensuring the BIOS was up to date and the issue would be gone.  Lately, my own laptop (7490) started making a high-pitched, fast fan noise.  Of course I remembered right away that I had recently updated the BIOS to 1.13.1.
I quickly decided to do a BIOS downgrade to 1.11.0 to see if that would help.

No more loud fan noise at this point... Having issues with your fan always running top speed? Try an older BIOS version and call Dell rep to complain.

I recently allowed a BIOS update to install and the issue came back on a Latitude 7490.
I then installed the Dell Power Manager application and found a section called "Thermal Management".  Under this section you can choose "Quiet", this instantly made the computer more bearable. 

Sunday, September 15, 2019

Have a device (Roku or other) that won't connect to wifi?

I have a sister-in-law that bought a new Roku Express this weekend.  She spent 4 hours fighting an issue where it wouldn't connect to her wifi, claiming that the passcode is incorrect.  She searched forums, called Xfinity support and Roku support, all to no avail.  She found advice saying to enter the MAC address in the router, which didn't help.  Reset her router passcode; but why, when every other device is working on the wifi just fine with that passcode?  Change the WPA2 AES settings to something else; again why, the other devices are working fine.

Finally she decides to call me.  After about 20 seconds looking at her router settings I advise making the 2.4GHz and 5GHz wifi networks the same password.  Since the Roku Express only supports 2.4GHz it's trying to connect to 2.4, but since they are different passcodes and the same SSID there is nothing indicating to her that she needs to enter the 2.4GHz passcode.  In fact she didn't even know it or that there was ANY difference as Xfinity staff set it up.

Immediately, this resolved the issue.

Make them the same SSID and Passcode and let it just work.  The device will connect to the frequency it wants / supports and the end user doesn't need to care.  Or if you insist on different passcodes for some reason, make the SSID different as well as a visual indicator.

Thursday, June 27, 2019

Testing your website for weak ciphers and protocols

With recent deployments and integrations of systems I have had to ensure that several websites are secure. After digging around and setting registry keys I figured someone else has done this already, so I started looking for a quick script.

Better yet, I found this handy software:

These guys have it set up so you can set the Schannel protocols and cipher suites, plus their order.
Then click the site scanner and you'll see the familiar Qualys SSL Labs site.
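If you just want a quick look at what's currently set (tools like this write these same registry keys), here's a rough PowerShell sketch that dumps the Schannel protocol entries.  Keys that don't exist simply mean the OS defaults apply:

```powershell
# Enumerate explicit Schannel protocol settings (Client/Server subkeys)
$base = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols'
Get-ChildItem -Path $base -Recurse | ForEach-Object {
    $p = Get-ItemProperty -Path $_.PSPath
    '{0}  Enabled={1}  DisabledByDefault={2}' -f $_.Name, $p.Enabled, $p.DisabledByDefault
}
```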

Monday, February 18, 2019

Wyse ThinOS and RD Gateway with Broker - External Access

The other day I was able to get my hands on a Dell Wyse 3040 with ThinOS unit. I wanted to test out connecting to a Windows Remote Desktop Gateway with Connection Broker and RDSH from home. My intended end users are at remote sites with VPN connections, but I had other ideas for some remote workers to utilize these devices (without VMWare or Citrix) to connect in.

This post isn't about setting up RDSH, RDGateway, etc.  This is in line with getting ThinOS 8.6+ working with your RD Gateway and RD Connection Broker to RDS Hosts.  Something that in hindsight was very easy, but took me a bit to weed through the online posts, ini settings, etc.

I used Wyse Management Suite to configure the device (online trial). This has been a great option and works very well.  For production I will be deploying WMS Standard onsite.

Windows Remote Desktop environment layout:
The environment consists of the following layout. 
  • All servers running Windows Server 2016
  • 1 server with RD Gateway and Web installed together.  We'll refer to this as
  • 1 server with Connection Broker installed (NOT in HA config)
  • 2 servers running RDSH and the desktop being published - Collection Name: Desktop Resources
  • Dell Wyse 3040 ThinOS 8.6_013 connected to my home network. NO VPN to main datacenter.
Goal: To get the 3040 to connect through the and broker the connection to the proper RDSH server.  I want it to prompt the user for login upon boot and upon disconnect to logout of the gateway and prompt for login again (Shared workstation).

WYSE config:
I'm going to break this down by section in the WMS portal.  Then I will do my best to put the wnos.ini out.  Obviously there are other areas to configure, I'm just giving the basics for the RDGateway to work.

Require Domain Login: First area of interest to me was to disable the "Require domain login".  I want the thin client to load and prompt with the connection to the RD Gateway. 

Certificates: Depending on the CA you used on your Gateway you'll need to import the certificates.  I used Godaddy so I had to get the .cer for the Root and Secondary.  This was as easy as going to my site, viewing the certs, and then downloading (copy to file) the GoDaddy Root CA and GoDaddy Secure CA to files.  From there you will upload both files into Apps & Data tab under the File Repository (select certificate for the type).
Now you can check the option for certs and you will see all of the certs you need listed.

Security Policy: I set mine to Full
TLSCheckCN: enabled
VNC: I turned on VNC to allow ease of testing

Visual Experience:
Action after all sessions exit: "sign off automatically"

Microsoft Broker:
Broker Server:
This should be set to your gateway server.  Include the https:// but do not include anything past the FQDN.

Sessions to connect automatically: Desktop Resources
This is the collection name.  Since in this case I'm pushing out a collection of desktops there is only the collection name and not app names.  

Microsoft RDP Settings:
Enable NLA: Enabled  
In my environment I have this on for all servers.

That's it.  Restart the device to apply and test it out.  Notice that when you logout it puts the workstation back at the login screen, perfect for shared workstations!
Note that I did NOT put any Direct RDP Connections in as this isn't needed.

Here's the device's wnos.ini as delivered from WMS.

Signon=Yes SaveLastDomainUser=no LastUserName=No
AddCert="Go Daddy Root CA - G2.cer"
AddCert="Go Daddy Secure CA - G2.cer"
SignOn=No ExpireTime=0 RequireSmartCard=No SCRemovalBehavior=0 DisableGuest=No
SecurityPolicy=full SecuredNetworkProtocol=Yes TLSMinVersion=1 TLSMaxVersion=3 DNSFileServerDiscover=Yes TLSCheckCN=Yes
AutoSignoff=10 Shutdown=no Reboot=no
SysMode=Classic toolbarclick=No ToolBarAutoQuit=No EnableLogonMainMenu=No
AutoLoad=2 VerifySignature=yes
ConnectionBroker=MICROSOFT \
host= AutoConnectList="Desktop Resources"
SessionConfig=all \
SessionConfig=rdp \
EnableNLA=yes EnableRecord=no EnableRFX=yes EnableTSMM=no ForceSpan=no enablegfx=no EnableUDP=yes EnableVOR=yes USBRedirection=rdp defaultcolor=2 MaxBmpCache=128 RDPScreenAlign4=no AutoDetectNetwork=yes EnableRdpH264=yes