Restart Service on Remote Computer with PowerShell

If you search for "PowerShell Restart Service Remote Computer", the challenge is that a lot of the top results are for earlier versions of PowerShell and far more complicated than needed.  For some reason, Microsoft doesn't just give the Restart-Service command a -ComputerName switch...I guess that would be too intuitive.  After some digging, I found it's still easy; you just need to Get-Service first.  Below is an example of restarting BITS.  I used a wildcard just to show that wildcards work.  Run this from an elevated PowerShell prompt and replace <COMPUTERNAME> with the name of your computer.

Get-Service -Name BIT* -ComputerName <COMPUTERNAME> | Restart-Service

This works great for restarting a service on a few remote computers.  If you need to restart a service on all the computers in your domain, here's a script to help with that process.  The script needs to be run from an AD server since it requires Get-ADComputer, or you'll need to install the Active Directory PowerShell module on the server where you'll be running it.  The value of this is that it pulls all the recently active computers and restarts your selected service.  This helped us with an issue where our remote access software ScreenConnect started dropping out of our console.  A bug in their keep-alives is causing the issue and the temp fix is to restart the service...but it's hard to remember which machines are missing, so this goes through all the computers active in your domain and restarts the service.

# Only target computers that have logged on within the last 15 days
$today = Get-Date
$cutoffdate = $today.AddDays(-15)

# Export the DNS host names of the active computers to a text file
Get-ADComputer -Properties LastLogonDate -Filter {LastLogonDate -gt $cutoffdate} |
    Select-Object -ExpandProperty DNSHostName |
    Out-File C:\All-Computers.txt

$computers = Get-Content "C:\All-Computers.txt"
$amount = $computers.Count
$a = 0

foreach ($computer in $computers) {
    $a++
    # Restart the service remotely; silently skip machines that are offline or unreachable
    Invoke-Command -ComputerName $computer { Restart-Service -Name 'ScreenConnect Client (f95335af7be34c6f)' } -ErrorAction SilentlyContinue
    Write-Progress -Activity "Working..." -CurrentOperation "$a Complete of $amount" -Status "Please wait.  Restarting service."
}

 

Enjoy!




Can't Change Display Brightness on Windows 10

If after upgrading to Windows 10 you can't change your display brightness (the option may be missing in Settings/Display), this may be your fix.

  1. Right-Click the Start button and select Device Manager
  2. Expand the Monitors section
  3. Right-click on Generic PnP Monitor and click on Enable

After fighting to find the latest video drivers (HP hasn't released Win 10 drivers for my Pavilion), this fixed my issue.




Windows 10 Fix for Remote Gateway VPN Bug

When connected to a VPN, we often want to continue using our local connection for Internet traffic rather than forwarding it through the tunnel.  With Win 10, there's a bug that prevents us from clicking on the Properties button for TCP/IP v4.  David Carroll posted a fix here.  Other posts suggest editing the RAS phonebook, but David's PowerShell method is much easier to me.  I'm posting the steps here as well (mostly for my records).  One key point: I've noticed that if your VPN name contains a space, Get-VpnConnection won't return your connection info.  A connection named "VPN 1" didn't work for me, but naming it "VPN1" works fine.

  1. From PowerShell, type Get-VpnConnection while connected to your VPN.  You'll notice that SplitTunneling is set to False.
  2. Run Set-VpnConnection "VPN1" -SplitTunneling 1 (replace VPN1 with the name of your VPN as returned by Get-VpnConnection).
  3. Disconnect from your VPN session and reconnect.
  4. In Bing or Google, just type "what is my ip" and you should see your local Internet IP rather than the one going through the VPN.

That should do it.  

 




SQL Azure Table Size

I've been fighting with a DotNetNuke install hosted on Azure for a while.  We've been testing it because there's a lot we like about Azure, but the performance when editing DotNetNuke has caused us to go a different route...but that's another story.  In our test site, the SQL database size just kept growing.  To find the culprit, I wanted to know the size of each table.  This post from Alexandre Brisebois did just what I needed.  My only tweak was to ORDER BY the size.

SELECT
      o.name AS [table_name],
      sum(p.reserved_page_count) * 8.0 / 1024 / 1024 AS [size_in_gb],
      p.row_count AS [records]
FROM
      sys.dm_db_partition_stats AS p,
      sys.objects AS o
WHERE
      p.object_id = o.object_id
      AND o.is_ms_shipped = 0
GROUP BY o.name, p.row_count
ORDER BY size_in_gb DESC

For me, it was pretty clear that I needed to truncate the DotNetNuke EventLog and ScheduleHistory.  In production, you would want to schedule this as it's amazing how quickly these can grow.  We hadn't put this site into production yet but the EventLog was 20 GB after just a few months.

truncate table EventLog
truncate table ScheduleHistory

This may or may not be related to anything you need, but for me in this SQL Azure situation, I noticed that even though I had truncated about 20 GB of data, the Azure Dashboard didn't reflect it.  Digging around, it seems to be related to my indexes being fragmented.  So, following Dilkush Patel's post, I ran this query to see my fragmentation.  (Note: I did make a minor change to order by the fragmentation percentage.)

SELECT
    DB_NAME() AS DBName,
    OBJECT_NAME(ps.object_id) AS TableName,
    i.name AS IndexName,
    ips.index_type_desc,
    ips.avg_fragmentation_in_percent
FROM sys.dm_db_partition_stats ps
INNER JOIN sys.indexes i
    ON ps.object_id = i.object_id
    AND ps.index_id = i.index_id
CROSS APPLY sys.dm_db_index_physical_stats(DB_ID(), ps.object_id, ps.index_id, null, 'LIMITED') ips
ORDER BY ips.avg_fragmentation_in_percent DESC, ps.object_id, ps.index_id

For me, I had several tables over 60% fragmented.  So I ran Dilkush's script:

DECLARE @TableName varchar(255)

DECLARE TableCursor CURSOR FOR
(
    SELECT '[' + IST.TABLE_SCHEMA + '].[' + IST.TABLE_NAME + ']' AS [TableName]
    FROM INFORMATION_SCHEMA.TABLES IST
    WHERE IST.TABLE_TYPE = 'BASE TABLE'
)

OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT('Rebuilding Indexes on ' + @TableName)
    BEGIN TRY
        EXEC('ALTER INDEX ALL ON ' + @TableName + ' REBUILD WITH (ONLINE=ON)')
    END TRY
    BEGIN CATCH
        PRINT('Cannot do rebuild with Online=On option, taking table ' + @TableName + ' down to rebuild')
        EXEC('ALTER INDEX ALL ON ' + @TableName + ' REBUILD')
    END CATCH
    FETCH NEXT FROM TableCursor INTO @TableName
END

CLOSE TableCursor
DEALLOCATE TableCursor

However, after running this (several times in fact) the usage in my Azure Dashboard has not changed.  I'll wait and see if it changes and update this post if it does.
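
In the meantime, if you want to compare against the space SQL itself reports as reserved (rather than waiting on the dashboard to catch up), a quick sanity check is to sum the same DMV the per-table query above uses.  This is just a sketch; the dashboard number can lag behind what it returns.

SELECT SUM(reserved_page_count) * 8.0 / 1024 / 1024 AS [reserved_size_in_gb]
FROM sys.dm_db_partition_stats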




Azure: The database 'xxx' has reached its size quota

I keep bumping into this issue, so I thought I'd post my steps to resolution.  When you create an Azure SQL database, it sets a size limit for the database.  When the database fills up, you'll get an error like this:

The database 'XXX' has reached its size quota. Partition or delete data, drop indexes, or consult the documentation for possible resolutions.

The fix requires two steps: first you need to increase the database size in the Azure portal, and then you need to alter your database in SQL.

  1. Login to the Azure Management Portal and go to your SQL Database.
  2. On the Dashboard tab, you should see that the size of your database is 100% of your total in the Usage Overview.
  3. Click on the Scale tab and change the Max Size and click Save.

 

That fixes the Azure max size limit, but now we need to update our SQL database itself. 

  1. Fire up SQL Management Studio and connect to your Azure SQL Server.
  2. On your database, open a New Query.
  3. Run this query replacing YOURDATABASE with the name of your database in both locations.

SELECT
    DATABASEPROPERTYEX('YOURDATABASE', 'EDITION') AS Edition,
    CONVERT(BIGINT, DATABASEPROPERTYEX('YOURDATABASE', 'MAXSIZEINBYTES')) / 1024 / 1024 / 1024 AS 'Max Size IN GB'

  4. This shows you your Azure database edition and your current max size.  Now run this query on the master database, changing the database name, edition, and MAXSIZE values as needed.

ALTER DATABASE YOURDATABASE MODIFY (EDITION='Standard', MAXSIZE=40GB)

Hit refresh on your site and it should now come up without error.




SQL Server Migration Assistant for Access nightmare

Getting from Access to SQL is not as much fun as it should be and it seems that it gets harder with each release. The upsize tool in Access 2013 is gone and the recommended way is now to use the SQL Server Migration Assistant (SSMA).

Like a lot of people, I run a 64-bit Windows OS with 32-bit Office (which is Microsoft's recommendation).  When running SSMA, I kept hitting the following error:


Access Object Collector error: Database

     Retrieving the COM class factory for component with CLSID {CD7791B9-43FD-42C5-AE42-8DD2811F0419} failed due to the following error: 80040154. This error may be a result of running SSMA as 64-bit application while having only 32-bit connectivity components installed or vice versa. You can run 32-bit SSMA application if you have 32-bit connectivity components or 64-bit SSMA application if you have 64-bit connectivity components, shortcut to both 32-bit and 64-bit SSMA can be found under the Programs menu. You can also consider updating your connectivity components from http://go.microsoft.com/fwlink/?LinkId=197502.

     An error occurred while loading database content.


 

Based on this post, I'd run regsvr32 on Dao360.dll and added that folder to my environment's PATH, but neither helped.  For others, running the 32-bit version of SSMA was the suggested fix, but that also didn't work for me.  I almost went down the path of setting CorFlags but just didn't feel that was my issue.

Thinking it was all about 32-bit and 64-bit, I pulled out my tablet, which is Win 8.1 32-bit, but it had the exact same error.  Finally, I found this post which had the fix: install the Microsoft Access Database Engine 2010 Redistributable.  The post used the 2007 edition but I used 2010 and it worked fine.  It may also work with the Microsoft Access Database Engine 2013, but once it worked with the 2010 edition, I moved on.  I'm not sure why having Access 2013 installed isn't enough, but I know a lot of other people are struggling with this issue and not getting much support from the SSMA team.  In fact, support for that product seems really lacking.  On the SSMA 5.2 version page (5.3 is out and is the one I installed) there were several comments (some very frustrated) describing my exact issue but no response from the Microsoft team.  I emailed their help address, which replied with an auto-response to open a ticket; I did, but still no response.  Hopefully this post will help someone and you won't feel so alone.

 




DPM 2012 fails to backup SQL 2012 database

Our SQL 2008 backups were working just fine with Data Protection Manager 2012 until we upgraded SQL to SQL 2012.  Then we started getting the error:

The DPM job failed for SQL Server 2012 database <SQL database> on <our sql server> because the protection agent did not have sysadmin privileges on the SQL Server instance. (ID 33424 Details: )

The suggestion is to add 'NT Service\DPMRA' to the sysadmin role on the SQL Server instance.  That's very specific, so that must be the fix.  The problem is I don't have an 'NT Service\DPMRA' user in Windows or SQL.  Here's the fix:

 

  1. In SQL Management Studio, connect to the SQL 2012 Server and then expand Security.
  2. Expand Logins and right click on NT AUTHORITY\SYSTEM and select Properties.
  3. Click Server Roles, check sysadmin and click OK.
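
If you'd rather make the change from a query window instead of the GUI, the T-SQL equivalent is a one-liner (a minimal sketch using the SQL 2012 ALTER SERVER ROLE syntax; run it on the instance DPM is protecting):

ALTER SERVER ROLE [sysadmin] ADD MEMBER [NT AUTHORITY\SYSTEM];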

I read a post saying you could also add NT Service\DPMRA like the Recommended Action in DPM states, but I don't have that as a SQL login and wasn't able to find it as a Windows account to create one.

Once I added sysadmin to the Server Roles, I was able to right click and “Perform Consistency Check…” and everything took off.  You may also go to the jobs themselves and click “Run configuration protection job again”. 

 

Note, for me the Consistency Checks still failed on some databases, but this time with a new (more common) error saying "Recovery point creation failed".  The fix was simply to create a new Recovery Point.




Error Publishing LightSwitch to Azure–Incorrect syntax near 'MULTI_USER'

LightSwitch makes publishing your app to Azure drop-dead easy once you know the process.  One scenario that may stump you occurs after you've published your app, made changes to your data model in LightSwitch, and then tried to republish.  If those changes result in a SQL error, you'll get the error below:

An exception occurred when deploying the database for the application.
Incorrect syntax near 'MULTI_USER'.

Not a very helpful error.  For me, this error has always been related to changes to the data source where a column has been deleted or perhaps changed to 'Required' when some of the existing rows have null values in that field.  The fix is to open up SQL Management Studio and fix the issue.  If you don't want to keep the data, you can just drop the SQL tables and publish again.  I usually leave all the ASP.NET membership tables and just drop the table that was changed.

So the key is to fix whatever SQL change the LightSwitch publishing is not liking.  For instance, I deleted two columns in LightSwitch, so the publish failed with the 'Incorrect syntax near 'MULTI_USER'' error.  So I just opened SQL Management Studio, deleted those two columns, and republished with no errors, and all my other data was intact.  It would be great if the actual SQL error were passed down to you so you knew exactly what change was the problem, but hopefully knowing it might be related to SQL helps.
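
As an illustration only (the table and column names below are hypothetical placeholders for your own schema), the manual fix in SQL Management Studio looks something like this:

-- Drop the two columns that were removed from the LightSwitch data model
ALTER TABLE dbo.Customers DROP COLUMN MiddleName;
ALTER TABLE dbo.Customers DROP COLUMN FaxNumber;

-- Or, if you don't need to keep the data, drop the whole table and let the next publish recreate it
-- DROP TABLE dbo.Customers;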




Delete TFS Azure Visual Studio Project

TFS Online doesn't currently provide a way through the web portal to delete a project.  Here are the steps to delete a TFS Online project.

1. Open the VS 2012 Native Tools Command Prompt, or just open a Command Prompt with Run as Administrator.  (Note: if you open a regular CMD prompt, you'll need to go to the location of TFSDeleteProject.exe, which is C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE assuming a 64-bit computer and that you installed VS 2012 on the C: drive.)

 

2. Enter the following command and hit <ENTER>

tfsdeleteproject /collection:https://<yoursite>.visualstudio.com/DefaultCollection <projectname>

The key that's easy to miss is the requirement to state DefaultCollection.  So if your TFS Online site is located at https://mysite.visualstudio.com and you want to delete the project MySite.Client.Testing, this would be the command:

tfsdeleteproject /collection:https://mysite.visualstudio.com/DefaultCollection MySite.Client.Testing

 

3. When prompted “Are you sure you want to delete the team project and all of its data”, enter Y if you’re sure (note it says it’s an irrecoverable operation).

 

That’s it.  Just hit refresh in Visual Studio or in the web portal and your project is now deleted.




Server 2012 GPO–Hide These Specified Drives

Prior to Server 2012, if you wanted to hide certain drives through GPO, you'd edit User Configuration, Policies, Administrative Templates, Windows Components, Windows Explorer and change the GPO "Hide these specified drives in My Computer", as stated in MS KB 231289.

 

After upgrading our domain to Windows Server 2012, I could not find that setting.  It turns out Windows Explorer has been renamed to File Explorer in the GPO.  There you'll find your settings.

 

That being said, it’s sort of an old policy and only hides certain drives.  So you may need this blog post to hide other drives.



