Disable Windows Update in Windows 10

I was looking at how to disable Windows Update in Windows 10. I want to control the downloading of new builds on one of my machines, as I need it to stay stable with my virtual machines running.

  • Open MMC (type MMC in the search box)
  • Click File -> Add/Remove Snap-in -> choose “Group Policy Object Editor”
  • Navigate to “Computer Configuration” -> “Administrative Templates” -> “Windows Components” -> “Windows Update”
  • The picture below explains the rest of the story.

disable-windows-update
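If you prefer to script this instead of clicking through MMC, the same policy can be set in the registry. This is a minimal sketch, assuming the registry path that the “Configure Automatic Updates” policy writes to; run it from an elevated command prompt.

```shell
rem Sketch: a registry equivalent of the Group Policy setting above.
rem NoAutoUpdate=1 turns automatic updating off entirely:
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v NoAutoUpdate /t REG_DWORD /d 1 /f

rem Or keep updates enabled but require a prompt before downloading (AUOptions=2):
rem reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v NoAutoUpdate /t REG_DWORD /d 0 /f
rem reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v AUOptions /t REG_DWORD /d 2 /f
```

After changing the policy keys, a reboot (or restarting the Windows Update service) makes sure they are picked up.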

tags: Disable windows update in Windows 10; Automatic windows update; Windows 10 Updates;

SharePoint Server 2016 Overview

Here are my notes from Ignite 2015. These notes are from this session: What’s New for IT Professionals in SharePoint Server 2016

Launch dates

  • Beta 1 – Q4/2015
  • RTM – Q2/2016

Patching

  • Zero downtime patching for build-to-build updates (like Cumulative Updates)
  • Patch size significantly decreases.
  • Responsive interface and new mobile experiences based on what’s available in O365

Hardware Requirements

About the same as SP2013. Compute doesn’t really change.

Software Requirements

  • W2k12R2 and Windows Server 10
  • .NET FW 4.5.2 for w2k12R2 or .NET FW 4.5.7 for WS10.
  • 64bit SQL Server 2014 SP1 at a minimum

Installation

  • Standalone installations are no longer supported. A single-server install will require a regular install of SQL Server on the local machine.
  • The role guidance that we had in SP2013 becomes specific roles in code in SP2016. Conceptually, there are three role types:
  • User services roles – any request coming directly from a user.
  • Robot services – any request that does not originate from an end user.
  • Cache services – DCS (Distributed Cache Service)

The goal is that a request from a user is managed end-to-end by a single machine. This reduces latency: there is no need to traverse the topology and send the request from server to server. This is called “min-role topology”.

During install, you pick one of the server roles.

  • Special Load does not use min-role. Same flexibility you had in SP2013: this server can play any role, just like before. Recommended for third-party or custom-developed services.
  • Web Front End – responding to a web user request from end to end. User services role.
  • Search – Indexing/Crawling.
  • App Server role – robot services role.
  • Single Server Farm – This does not include SQL Server Express any longer. You need to install SQL Server on that machine separately. All SP bits are installed, like Special Load.
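For scripted installs, the role choice surfaces in PowerShell as well. A minimal sketch of creating a farm with a specific role, assuming SP2016’s -LocalServerRole parameter on New-SPConfigurationDatabase; the server, database, and passphrase values below are placeholders:

```shell
# Run in the SharePoint 2016 Management Shell. All names/values below are
# placeholders for illustration only.
New-SPConfigurationDatabase -DatabaseName "SP2016_Config" `
    -DatabaseServer "SQL01" `
    -Passphrase (ConvertTo-SecureString "Pass@word1" -AsPlainText -Force) `
    -FarmCredentials (Get-Credential) `
    -LocalServerRole WebFrontEnd   # or Application, DistributedCache, Search, Custom, SingleServerFarm
```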

Upgrade

  • v14 mode (SP2010 mode) site collections need to be upgraded to v15 (SP2013) before an upgrade.
  • DB-attach upgrade from SP2013 to SP2016. There is no direct path from SP2010.
  • Think of SP2013 as the base kernel for future SP releases. For the most part, there is schema parity from 2013 to 2016.
  • Service app architecture does not change.
  • When developing SP2016, MSFT took a point-in-time snapshot of O365 and created a new branch.

AuthN/Z

  • SAML is the default and a first class citizen. Basically, there is one auth provider that is both claims and cloud ready. This helps make the cloud more transparent from an app-dev point of view.
  • MSFT is moving away from domain/classic mode auth and moving towards cloud-based auth. What about Windows Identity over SAML? This is supported but still deprecated.
  • SMTP Connection encryption. StartTLS connection encryption. Can use non default ports, other than TCP 25. Fallback to unencrypted SMTP is not supported.

Perf and reliability

  • Performance is expected to be significantly improved with the min-role concept covered during installation.
  • Health analyzer is specific for the role. It detects a service that deviates from the role. Can you start a service on a server that is incompatible with its role? Health analyzer does not run against Special Load.
  • Not in compliance means the health analyzer found a difference between what’s running and what it expects to find. There is a Fix link to resolve this.

Patching

  • Size of the patch has been reduced significantly. Number of MSIs and MSPs reduces down to 2 and 4 respectively. Previously this was 37 + (10 x number of language packs).
  • The upgrades install faster with no downtime. In the past, achieving 99.9% uptime was too difficult; it is definitely possible now.
  • Build-to-build upgrades are an online operation. Completely transparent to users. Upgraders used to run offline where services were stopped.
  • With fewer configuration combinations (e.g. using min-role topology), testing is simplified and the overall stability of the system increases. This is also how the patching footprint can be made much smaller and faster.

Distributed Cache (DCS)

In SP2013, AD needed to be consulted for each cache request which really slowed down the perf. This has been eliminated by using a new transport layer.

Boundaries and limits

  • Moving away from FSSHTTP. Using Background Intelligent Transfer Service (BITS). What about Shredded Storage? Nothing discussed here.
  • SPSite (site collection) provisioning is much faster. SPSite.Copy is used at the DB layer: basically just adding new rows at the table level based on the template. Things like feature activation do not slow it down now.
  • MSFT is still thinking about incorporating Traffic Mgmt as with O365. New end point on web servers. This is not official yet, but in the planning/feasibility stage.

User Profile Services

Two-way FIM support is pulled out of SP. If you need this, you can use the full external Forefront Identity Manager. SP only supports the simple one-way sync from AD.

Project Server database gets merged into a content db. Brings Project Server closer to SP. SP2016 doesn’t include Project Server.

Durable Links

Files can be moved in between site collections or renamed and the durable link still works. This is based on a resource ID that has an affinity on the document.

Reporting

Lots of new telemetry in SP2016. Lots of new reporting on usage, storage, health, perf. This is the first time I’ve seen this degree of reporting in any edition of SharePoint. This will definitely diminish the value of Report Centre.

Compliance

  • O365 Compliance Centre can also cover on-prem content.
  • In-place hold and e-discovery on both O365 and on-prem.
  • Classification ID – a discrete representation of a particular piece of IP. For example, there is a credit card class ID, but in addition, they look for something else to corroborate it. For example, expiration date. There will be many others, SSN, Driver’s license, etc.

Search – unified search result set.

No longer separate result blocks. One search to rule them all. Also brings the power of Delve and Office Graph onto On-Prem. Is there a unified search index in O365? Did I hear that right?

Keywords: SharePoint Server 2016 Overview

Notes captured by: Randy Williams

Nintex Workflow Migration – Data and Configuration Migration

This is the migration process for Nintex Workflow data from one SharePoint farm to another, where the two farms are mutually exclusive and run with their own accounts and instances. For example, when you want to bring PROD data into non-prod to perform a content refresh for testing CUs or other major enhancements, or when you have to consolidate two SP farms into one by bringing web applications over from one farm and merging them into the other.

The idea is to preserve the current Nintex configuration of the destination farm and move only the Nintex data and configuration from the source farm. Nintex has a command-line tool that migrates data from the source Nintex database into the destination database. Here we have two options: move the data into the existing Nintex database, or create an additional Nintex content database and move the data into the newly created one. I prefer the second option, because rollback is easier if we have to do it. (SP2010 and Nintex Workflow 2010 are the context of this document.)

The process can be divided into 5 Major activities:

  • DB creation
  • Migrate data
  • Migrate configuration
  • Testing
  • (optional) Rollback

DATABASE CREATION:

How do you add a new Nintex content database? Either from Central Administration or with the Nintex command-line tool, NWAdmin.

For me, it’s easier to create the DB from Central Administration than from the command line.

workflow1

Nintex’s command-line tool migrates only the workflow data; all other configuration either has to be manually recreated/updated on the workflows, or the necessary DB table data has to be imported into the destination. We will look at that a little later in the process.

Nwadmin is a command line tool that ships with Nintex Workflow 2010. It is used to perform various administration operations.

By default, the NWAdmin.exe tool is installed to the SharePoint hive, typically at the following path.

%ProgramFiles%\Common Files\Microsoft Shared\Web Server Extensions\14\BIN

Note: For some versions of Nintex Workflow 2010, the Nwadmin tool is installed in the installation directory, typically at the following path:

%ProgramFiles%\Nintex\Nintex Workflow 2010 (I found it here, not in the 14 hive)

workflow2

https://community.nintex.com/servlet/JiveServlet/downloadBody/1027-102-7-2571/NWAdmin_Operations_2010.pdf

Migration Process:

Pre-Migration:

Back up the source (Prod) and destination (non-prod) Nintex DBs.

Either bring the source database over to the destination SQL Server and grant the destination farm account access to the DB, or grant the destination farm account access to the DB wherever it is.

Grant the FARM_ADMIN account the db_owner role on the destination farm DBs: SP_NW2010; SP_NW2010Source
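As a sketch, those grants can also be done from the command line with sqlcmd; the server and account names below are placeholders, and sp_addrolemember is the SQL Server 2008-era way to add a role member:

```shell
rem Sketch only: grant the farm admin account db_owner on both Nintex DBs.
rem Server and account names are placeholders.
sqlcmd -S TSTSP10DB -d SP_NW2010 -Q "EXEC sp_addrolemember 'db_owner', 'DOMAIN\FARM_ADMIN'"
sqlcmd -S TSTSP10DB -d SP_NW2010Source -Q "EXEC sp_addrolemember 'db_owner', 'DOMAIN\FARM_ADMIN'"
```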

Migration:

Consider the following points, which are important while migrating the Nintex data.

While migrating, the web applications need to be stopped (IIS) and the Timer service on all servers needs to be stopped as well. Consider gathering all the site collections where Nintex workflows are active.

From SQL, run the following query: “SELECT SiteID FROM [NintexDBname].[dbo].[Storage]”
Or gather the list of site collections in Central Administration : Nintex Workflow Management : Nintex Workflow Database Setup : View Database Mapping

While the web apps and timer service are off, run the following command to migrate data:
nwadmin -o MoveData -SiteID "GUID" -SourceDatabase "Prod Nintex DB connection string" -TargetDatabase "Destination Nintex DB connection string" -RetainSourceData

Connection string format: “Server=DBserver;Database=DBName;Integrated Security=True”

workflow3

Make sure you create a SQL alias so that you pass a single string as the server name rather than using “server\instancename”.

The command window output looks like the below when it succeeds:

C:\Program Files\Nintex\Nintex Workflow 2010> nwadmin -o MoveData -SiteID “F1E52EF2-996F-46A1-A0D1-250D8970E725” -SourceDatabase “Server=TSTSP10DB-BNE;Database=SP_NW2010DB;Integrated Security=True” -TargetDatabase “Server=TSTSP10DB;Database=SP_NW2010_BNE;Integrated Security=True” -RetainSourceData

Before continuing this operation, please stop the following on each server:

SharePoint IIS web site

SharePoint 2010 Timer Service

If these services are not stopped, workflows may continue adding data, leaving the instance in an invalid state.

It is recommended that the source and target content databases are backed up before continuing.

Data will be moved for site collection ID: f1e52ef2-996f-46a1-a0d1-250d8970e725.

Restart each service to continue workflow operation.

C:\Program Files\Nintex\Nintex Workflow 2010>
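When there are many site collections to move, the MoveData calls can be scripted. A sketch as a batch file, assuming the GUIDs returned by the Storage query earlier have been saved one per line to siteids.txt (the file name is hypothetical):

```shell
@echo off
rem Sketch: run MoveData once per site collection GUID listed in siteids.txt.
rem Remember: the IIS web sites and the SharePoint Timer service must be
rem stopped on every server before running this.
for /f %%G in (siteids.txt) do (
    nwadmin -o MoveData -SiteID "%%G" ^
        -SourceDatabase "Server=TSTSP10DB-BNE;Database=SP_NW2010DB;Integrated Security=True" ^
        -TargetDatabase "Server=TSTSP10DB;Database=SP_NW2010_BNE;Integrated Security=True" ^
        -RetainSourceData
)
```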

Post-Migration

  • Grant db_owner access and assign the WSS_Content_Application_Pools role to all the app pool accounts of the migrated web applications on this DB.
  • Start the web applications and the SharePoint Timer service.
  • Log in to each site collection, go to Nintex Workflow, and check for any errors.

workflow4

Migrate Configuration:

Up to here we have completed one part of the migration, moving the WF data. There are other components we need to consider moving; some of them are listed below.

  • Workflow constants
  • Workflow Schedules (need to be recreated manually)
  • Managed Allowed Actions
  • User Defined Actions
  • Lazy Approval settings
  • Error Notifications
  • Delegation
  • EventReceivers

If you have some expertise in SQL queries, that will be handy in identifying differences between tables in the two DBs.

Some of the configurations I can help identify here, with out-of-the-box solutions; for the remaining ones, we need to use the DB table move method or assess the need case by case.

Workflow Constants:

These will not be migrated and need to be moved manually. If there are only a few, it’s easy to add them back to the workflows; otherwise consider the commands below to get them updated.

Use nwadmin -o ExportWorkflowConstants and then import with nwadmin -o ImportWorkflowConstants:

NWAdmin.exe -o ExportWorkflowConstants -siteUrl siteUrl -outputFile pathToFile [-includeSite] [-includeSiteCollection] [-includeFarm]

NWAdmin.exe -o ImportWorkflowConstants -siteUrl siteUrl -inputFile pathToFile -handleExisting Skip|Overwrite|Abort [-includeSite] [-includeSiteCollection] [-includeFarm]

Use the NWAdmin documentation I referred to earlier for detailed steps.
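As a concrete sketch of the two commands above (the URLs and file name here are hypothetical):

```shell
rem Export the constants from the source site collection and farm...
NWAdmin.exe -o ExportWorkflowConstants -siteUrl http://prod-portal -outputFile constants.xml -includeSiteCollection -includeFarm
rem ...then import them into the destination, skipping any that already exist.
NWAdmin.exe -o ImportWorkflowConstants -siteUrl http://test-portal -inputFile constants.xml -handleExisting Skip -includeSiteCollection -includeFarm
```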

MessageTemplate: The entries can be added back via the UI (Central Administration – Nintex Workflow Management – Message Templates). The impact is that customized templates will not show in emails.

EventReceivers: The impact is that if event receiver entries are missing, task-related actions might not work properly. The entries are added to this table when the Nintex Workflow features are activated; if the destination configuration table already has entries for the sites, this can be ignored. Otherwise, deactivate and re-activate the Nintex Workflow features (“Nintex Workflow 2010” and “Nintex Workflow 2010 InfoPath Forms”) on all the migrated site collections where you have Nintex workflows active/running.

Changes in Functionality:

Please note that the send-notifications email address will change for users (if the source uses a different email address), as only one address can be used per farm.

Environment | From Address | Reply To Address

Testing:

I suggest the configuration areas below be tested in a test environment before the actual live production migration:

  • Lazy approval (not required as it isn’t configured in either farm)
  • send notifications
  • query AD
  • execute SQL
  • Web request
  • call web service
  • Any other actions that are using stored credentials to connect to other systems

Rollback Procedure:

Drop the NEW Nintex content destination DB. The source database is still untouched and still holds the original WF data.

Having said that, when migrating some web applications from one farm to another you cannot bring all the Nintex configurations, as the destination farm already has its own. So a good amount of testing is needed around what is required and how it can be brought over, and I hope this document provides some insight into that.

The reason this document was created is that we could not get proper documentation on this process, either online or from Nintex support.

SharePoint Calculated Column – how to work around

Q: I work in Communications and we collect the news related to our department and send it out by email twice daily, once in the morning and once in the afternoon. I have set up a SharePoint library to group by Date, so it groups uploaded files by the date they were added, which is great. I was wondering if I could group the files under these dates into morning and afternoon (i.e. before 12:00 and after 12:00), by the time they were added, so that they correspond with our morning and afternoon news alerts? Continue reading “SharePoint Calculated Column – how to work around” »