How to Remove Built-in Windows 10 Apps

Windows 10 includes a number of built-in apps, ranging from basic apps like Calculator and Weather to more task-focused apps like Mail and Photos. While these built-in apps are fine for most situations, in a business environment they may be inappropriate, redundant or unsupported. Very often, these apps may even pose a security risk. The problem is that Microsoft doesn't make it easy to uninstall some of these apps: there is no uninstall button when using the normal methods, so the built-in apps must be removed through PowerShell.

Before we get started, I do not recommend uninstalling all the packages. Many of them are needed for the Windows 10 “Experience” and others, like the .NET Framework, are needed by other programs. Be picky about which applications to uninstall. You can reinstall all the applications, and I will have a PowerShell command to do just that at the end of this article.

Different sets of packages

There are actually two different kinds of applications that we will be working with.
  • AppXPackages – Applications installed with the operating system
  • AppXProvisionedPackages – Applications installed as part of the user profile first-time setup.
The first step is to get an inventory of the applications that are installed. To do that, start PowerShell with elevated privileges. For the AppxPackages, we can enter the command Get-AppxPackage.
The provisioned packages have a slightly different command, Get-AppxProvisionedPackage, which also needs the -Online parameter. The -Online parameter denotes that we want a list from the current online operating system, as opposed to an image file located in a local directory. This will present a list of all the details regarding each package. This is a rather verbose listing, and all I am interested in is the Name of the package for the AppxPackages and the DisplayName for the provisioned packages.
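For reference, the two inventory commands look like this when run from the elevated prompt:

    # Inventory of apps installed with the operating system
    Get-AppxPackage

    # Inventory of apps provisioned into new user profiles
    Get-AppxProvisionedPackage -Online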
Details of the Get-AppxPackage listing
Listing from Get-AppxProvisionedPackage
To make things a little easier, let's pipe the results through Select-Object and select the Name and DisplayName properties. This will give us a list like the one below. This list is easier to work with, and now we can easily copy and paste the applications we are interested in.
Output of Get-AppxPackage | Select-Object Name
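The commands that produce those shorter lists are:

    # Shorter, easier-to-read lists of package names
    Get-AppxPackage | Select-Object Name
    Get-AppxProvisionedPackage -Online | Select-Object DisplayName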

One trick that I use is to save the results to a text file and then open that file in Visual Studio Code. Now that we have our list, we can start building our script. Select the applications from the list that you want to uninstall, build a simple array, and populate the names of those applications into the array, as seen below. I have given my array the variable name $ProvisionedAppPackageNames.
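Here is a sketch of both steps. The file path and the package names in the array are only examples, and the code command assumes the VS Code command-line launcher is on your PATH; substitute the applications you selected from your own inventory.

    # Save the inventory to a text file and open it in Visual Studio Code
    Get-AppxProvisionedPackage -Online |
        Select-Object DisplayName |
        Out-File -FilePath C:\Temp\ProvisionedApps.txt
    code C:\Temp\ProvisionedApps.txt

    # Build an array of the packages to remove (example names only)
    $ProvisionedAppPackageNames = @(
        "Microsoft.BingWeather"
        "Microsoft.ZuneMusic"
        "Microsoft.ZuneVideo"
    )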
With my array populated with the specific applications I want removed, we can now set up a foreach loop to step through and uninstall each package.
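A minimal sketch of such a loop, removing each named package both from existing user profiles and from the provisioning list (the exact parameters in your script may differ):

    foreach ($ProvisionedAppName in $ProvisionedAppPackageNames) {
        # Remove the package for existing user profiles
        Get-AppxPackage -Name $ProvisionedAppName -AllUsers | Remove-AppxPackage

        # Remove the package from the provisioning list used for new profiles
        Get-AppxProvisionedPackage -Online |
            Where-Object DisplayName -EQ $ProvisionedAppName |
            Remove-AppxProvisionedPackage -Online
    }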
If for any reason you want to reinstall all the applications, type the following command in an elevated PowerShell console.
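The commonly used re-registration one-liner looks like this; treat it as a sketch rather than the exact command from the original post:

    # Re-register every built-in app for all users from its AppXManifest.xml
    Get-AppxPackage -AllUsers | ForEach-Object {
        Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppXManifest.xml"
    }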

How To Make Visual Studio Code Look And Behave Like The PowerShell ISE

Despite its limited features and options, PowerShell ISE used to be the primary tool for developing and editing PowerShell scripts. It offered an integrated development environment (IDE) that included some basic features for building scripts and modules.

Microsoft is no longer actively developing the PowerShell Integrated Scripting Environment (ISE), and it is being replaced by the more powerful and versatile open-source Visual Studio Code (VS Code). With its ever-expanding options and extensions, VS Code is quickly becoming the new standard tool for developing not only PowerShell but just about any other language you choose.

Despite all the new features available in VS Code, leaving the familiar environment of PowerShell ISE is difficult. It is like watching your child go off to college: you are proud of the achievement but sad about leaving a comfortable environment.

VS Code can be intimidating at first, as its default settings can be a little hard to work with if you are used to PowerShell ISE. However, it is highly customizable, and with the addition of extensions and a few configuration settings, you can make VS Code look and behave just like PowerShell ISE.

The Look

To get VS Code to look like PowerShell ISE, the PowerShell extension needs to be installed. To install it, select the settings gear at the bottom left, then pick Extensions.

In the search box, type in PowerShell and then click Install. This extension adds a few features to the default settings of VS Code.
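If you prefer the command line, the extension can also be installed from a terminal, assuming the code command is on your PATH:

    # Install the PowerShell extension from the VS Code CLI
    code --install-extension ms-vscode.PowerShell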


To get the distinctive look of PowerShell ISE, select the settings gear and then Color Theme. Choose the PowerShell ISE theme.



Now that you have the look of PowerShell ISE, we need to set the behavior to match.

The Behavior

The default install of VS Code lacks some behaviors of PowerShell ISE, such as Zoom, Tab-Completion, IntelliSense tuning, and Code Snippets.

To set the environment to match that of PowerShell ISE, we need to add some settings to the VS Code configuration.

Keyboard and Mouse Actions

Open the command palette using the Ctrl+Shift+P key combination. In the command palette box, enter “Preferences: Open Settings (JSON).” This will open a two-pane window with the user settings on the right. Insert the following code between the brackets in the right pane.
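The original settings block isn't reproduced here; as a hedged sketch, two settings that restore ISE-style zoom and Tab behavior (setting names taken from current VS Code releases) look like this:

    // Zoom with Ctrl + mouse wheel, as in the ISE
    "editor.mouseWheelZoom": true,
    // Complete the current suggestion with the Tab key
    "editor.tabCompletion": "on",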

Environment Settings
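Again as a hedged example rather than the article's original list, a few more settings that nudge the overall environment toward the ISE:

    // Treat new untitled files as PowerShell
    "files.defaultLanguage": "powershell",
    // Show snippets at the top of the IntelliSense list
    "editor.snippetSuggestions": "top",
    // Keep focus in the editor when running code in the integrated console
    "powershell.integratedConsole.focusConsoleOnExecute": false,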

Code snippets

One of the best features of PowerShell ISE is the ability to use Code Snippets. VS Code has made Code Snippets more versatile.

To add Code Snippets, select the settings gear and then pick “User Snippets.” In the command palette, enter “powershell.json.” I’ve created a sample user snippets JSON file for you available here.
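The sample file itself isn't reproduced here, but a minimal powershell.json entry looks something like this (the foreach snippet is purely illustrative):

    {
        "PowerShell foreach loop": {
            "prefix": "foreach",
            "body": [
                "foreach (\\$item in \\$collection) {",
                "\t$0",
                "}"
            ],
            "description": "Insert a basic foreach loop"
        }
    }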

VS Code is now the preferred PowerShell editor. With a few customizations, we can make it behave just like the familiar PowerShell ISE.

How I Arrived at TechSnips

I Get Paid to do This?

There are two things I get paid to do: solve problems and learn new stuff.

I was fortunate to know very early that I wanted to work with computers. The only problem was that I could not concentrate long enough to complete a formal school course.

I was kicked out of two community colleges on academic suspension. I tried to take the required courses like history, calculus, economics, etc., but I got bored and never did the required homework or studying.

It All Started in the Navy

I eventually ended up in the Navy. I signed up for an extra year to get the technical training I wanted. It was perfect: eight hours a day, five days a week, for a year. Straight electronics. Resistors, capacitors and circuit boards, oh my!

The best part? I was given a box of parts and spent three months putting it together to build a radio. If it worked, I passed; if not, I failed. (Hint: I passed.)

Oh, and the discipline, the self-esteem, and doing things I never thought I could do turned a manchild into a proud, self-confident man. The Navy instilled in me the self-discipline I was sorely lacking.

My Consultant Years

I became an IT consultant right out of the Navy. My lack of a formal degree was an obstacle at times, but I never said no to a problem. I knew I could solve any problem thrown at me. After all, I had just spent six years in the Navy doing exactly that.

My love of learning also kept me up to speed on technology. I spent two hours every night either on a self-study course or learning a new technology that had just come out.

My first network security job even had me build my own Linux PC on the first day. Back then it was a very manual process with a lot of compiling. If you have ever built a Linux PC using “Linux From Scratch,” you will know what I mean. I learned a lot and I loved it.

A lot of what I learned did not immediately apply to my job. For example, I am not a programmer but I learned how to use Git just because I thought it was cool. It was the same for a lot of technology I learned. This lifelong learning kept me employed.

How did I arrive at TechSnips?

Well, after many years of IT consulting, it was getting a little uncomfortable being the oldest tech on the IT team. I have been asked in more than one interview why I am not an IT manager or why I don't have supervisory experience. Yes, age discrimination is a thing.

You know what IT managers do? Answer phones, create budgets, develop strategic plans that no one will use, and use words like “synergy,” “heterogeneous” and “teamwork.” No, I need to be in front of a keyboard. I can make servers dance, sing and do your dishes.

So while my peers were getting management jobs, I was designing networks for data centers, installing hundreds of servers, laying out cabling, and learning to break into systems. I was good at it.

I had been looking for something that I could do remotely and still generate income; I do have a family to support. I started (and failed at) several blogs. I realized that I did not have the business acumen to make them successful.

Feeding My ADHD

This is where TechSnips comes in. I actually came across TechSnips through a tweet about a snip for a problem that I had. I watched the snip, and it led me to the solution I needed.

I had been working from home for a while by then, and I was not looking to go back to the office. I wondered if they were hiring and, even better, if they were hiring remote techs. I navigated to the contributor page and immediately knew I had to apply.

The more I read about the contributor role, the more excited I became. I was hesitant to submit a video audition (for the presenter role), but Anthony Howell said it did not matter: “Produce a snip and let me see what you’ve got.”

The format of producing a snip fed very well into my ADHD: short, technical videos on a specific topic that I am interested in. If I get bored with a topic, I can choose another or suggest one. Each snip was to be no more than 15 minutes long. I thought this was perfect. I could do this.

So, I produced my first snip and it was accepted. A couple of days later, I got a call from Anthony Howell welcoming me to the team. The more I heard about what TechSnips is all about, the more excited I became, and I knew I had made the right decision.

So, today, I have produced several snips and have many more in the works. Producing snips has also given me the confidence to improve my presenting skills.

I am not used to talking in front of people or teaching online, but the team at TechSnips has provided valuable advice on how to present technical videos and engage an audience.

TechSnips is giving me the opportunity to not only do what I love but actually get paid to do it.

Installing PowerShell Core Everywhere

DevOps requires that SysAdmins be experts in more than one operating system. That used to mean learning more than a few shell scripting languages. PowerShell Core is changing that.

With PowerShell Core, it is no longer necessary to learn a new scripting language to support heterogeneous environments.

PowerShell Core is a new edition of PowerShell that is cross-platform (Windows, macOS, and Linux), open-source, and built for heterogeneous environments and the hybrid cloud.

It has recently become available on Windows Internet of Things (IoT). The cross-platform nature of PowerShell Core means that scripts that you write will run on any supported operating system.

What’s the Difference?

The main difference is the platforms they are built on.

Windows PowerShell is built on top of the .NET Framework and, because of that dependency, is only available on Windows. It is launched as powershell.exe.

PowerShell Core is built on .NET Core, is available cross-platform, and is launched as pwsh.exe.

Installing PowerShell Core

To install on a Windows client or Windows Server, navigate to the GitHub repository – PowerShell Core – and download the .msi package appropriate for your system.

Windows IoT devices already have Windows PowerShell installed, which we will use to install PowerShell Core.

For Linux distributions, it is just a matter of adding the Microsoft package repository and installing with the distribution's package manager.

For Ubuntu and Debian:
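A representative example for Ubuntu 18.04; the packages-microsoft-prod.deb URL varies by distribution and release, so check the PowerShell installation docs for yours:

    # Register the Microsoft package repository, then install PowerShell Core
    wget -q https://packages.microsoft.com/config/ubuntu/18.04/packages-microsoft-prod.deb
    sudo dpkg -i packages-microsoft-prod.deb
    sudo apt-get update
    sudo apt-get install -y powershell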

For CentOS and Red Hat:
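A representative example for CentOS 7 and RHEL 7; adjust the repository path to match your release:

    # Register the Microsoft repository, then install with yum
    curl https://packages.microsoft.com/config/rhel/7/prod.repo | sudo tee /etc/yum.repos.d/microsoft.repo
    sudo yum install -y powershell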

For openSUSE:
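A representative sketch; the repository URL is an assumption based on the Microsoft package feeds, so verify it for your openSUSE version:

    # Import the Microsoft signing key, add the repository, then install with zypper
    sudo rpm --import https://packages.microsoft.com/keys/microsoft.asc
    sudo zypper ar https://packages.microsoft.com/rhel/7/prod/ microsoft
    sudo zypper refresh
    sudo zypper install powershell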

And finally, for Fedora:
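Again as a representative sketch; check the docs for the repository that matches your Fedora release:

    # Register the Microsoft repository, then install with dnf
    sudo rpm --import https://packages.microsoft.com/keys/microsoft.asc
    curl https://packages.microsoft.com/config/rhel/7/prod.repo | sudo tee /etc/yum.repos.d/microsoft.repo
    sudo dnf install -y powershell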


For macOS, Homebrew is the preferred package manager.

Installing the Homebrew package manager is a single-line command from a terminal; then install PowerShell Core.
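As a sketch only, since the Homebrew bootstrap one-liner changes over time; copy the current command from brew.sh:

    # Install Homebrew (see https://brew.sh for the current bootstrap command)
    /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
    # Install PowerShell Core as a cask (newer Homebrew releases use: brew install --cask powershell)
    brew cask install powershell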

Embracing DevOps means being able to manage different platforms and operating systems, and learning the different shell scripting languages needed to maintain them. With PowerShell Core, you write once and deploy everywhere. It's another tool in your toolbox.

If you don’t learn it, someone else will.

Duplicating SharePoint Farms with SharePointDSC.Reverse


SharePoint farm configurations are notoriously difficult not only to document accurately but also to migrate to a new SharePoint farm.


Commercial tools and utilities help, but each tool has its pluses and minuses, and some of them are ineffective and often buggy. Additionally, the tools can be expensive and come with a steep learning curve.

SharePointDSC.Reverse

SharePointDSC.Reverse is a script developed by Nik Charlebois that utilizes the SharePointDSC resources to gather detailed information about the farm and output it into a configuration file that can be consumed by PowerShell DSC and the SharePointDSC resources.

The resulting PowerShell DSC configuration files can be used to create a near-perfect copy of the farm in a new environment, or can be used as a template for Azure Automation.

SharePointDSC.Reverse currently supports SharePoint Server 2013 and 2016, with SharePoint 2019 coming soon, running on Windows Server 2008 R2, Windows Server 2012, or Windows Server 2012 R2 and higher.

Getting Started

There are a few prerequisites before running the script. PowerShell 5.1 is required, and two PowerShell DSC modules also need to be installed.
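The modules are not named at this point in the article, but SharePointDSC.Reverse typically relies on the SharePointDsc and ReverseDSC modules, which can be installed from the PowerShell Gallery:

    # Install the required DSC modules from the PowerShell Gallery
    Install-Module -Name SharePointDsc -Force
    Install-Module -Name ReverseDSC -Force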

Log into the Central Administration server and open a PowerShell session as administrator. The SharePointDSC.Reverse script is installed with a similar command, but using Install-Script instead of Install-Module. To install the SharePoint Reverse script, we'll use the following.
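A sketch, assuming the script is published to the PowerShell Gallery under the name SharePointDSC.Reverse:

    # Install the reverse-engineering script from the PowerShell Gallery
    Install-Script -Name SharePointDSC.Reverse -Force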

How To Use

Now that we have all the necessary modules installed, the script is fairly easy to use. To start the process, run SharePointDSC.Reverse.ps1.

As the script runs, it asks for the credentials of the various managed accounts. Using the DSC resources provided by SharePointDSC, the script performs a detailed scan of the farm, gathering all the settings and configurations.

For a large farm, this will take several minutes to complete. Once it's done, it prompts for a directory in which to save the results. The resulting files can then be consumed by SharePointDSC.

To validate the configuration, compile the SPFarmConfig.ps1 file to create the .mof resources.
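As a hedged sketch of that compile step; the configuration name and output folder below are assumptions, so check what the generated SPFarmConfig.ps1 actually declares:

    # Load the generated configuration, then invoke it to compile the .mof files
    . .\SPFarmConfig.ps1
    SPFarmConfig -OutputPath .\SPFarmOutput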

The resulting files from SharePointDSC.Reverse can be used to duplicate the SharePoint farm in different environments, on-premises or in the cloud. The configuration file, the error log, and the environment data file all contain detailed configuration settings of the farm. Custom solutions (.wsp files) are copied into the directory as well.

Duplicating the SharePoint farm

The SPFarmConfig.ps1 file can also be uploaded to Azure Automation to duplicate farm configurations for your Azure-based SharePoint farm. To duplicate the SharePoint farm in a new environment, apply the configuration to the farm by starting the DSC configuration.
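For example, assuming the compiled .mof files landed in a folder named SPFarmOutput as in the earlier sketch:

    # Push the compiled configuration to the farm
    Start-DscConfiguration -Path .\SPFarmOutput -Wait -Verbose -Force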

Additional Details

In a multi-node farm, the ConfigurationData.ps1 file already has the node names, roles, and services that are running on each server in the farm. The file is formatted much like JSON, and editing it for the new environment can easily be done in Visual Studio Code.

The SPFarmConfig.ps1 file has the detailed farm configuration and also lists installed products and their version numbers. It also has details about each web application, site collection, and farm setting. Applied patches and the version numbers of installed products are displayed as well.

One additional benefit of these files is that they can be part of a disaster recovery plan. Restoring the farm from a complete loss can now be accomplished in hours instead of days.