I have a requirement to run a PowerShell .ps1 file each evening to perform a backup in an AWS environment. Having written the .ps1 script and placed it on the server, I created the Task with the following parameters using the Task Scheduler GUI.
Program/script C:\Windows\SysWOW64\WindowsPowerShell\v1.0\powershell.exe
Add arguments -File ".\EC2_Backup.ps1"
Start in D:\Scripts
To load the AWS connection details (region and access keys), I also added the following to the top of the script:
Initialize-AWSDefaults
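For context, a minimal sketch of what the top of such a script might look like - the snapshot loop below is illustrative only and is my assumption, not the actual backup logic:

```powershell
# EC2_Backup.ps1 - illustrative sketch; the snapshot loop is an assumption,
# not the actual backup logic from this post
Import-Module AWSPowerShell      # AWS Tools for Windows PowerShell
Initialize-AWSDefaults           # load the stored default credentials and region

# Example: snapshot every EBS volume visible to these credentials
foreach ($volume in Get-EC2Volume) {
    New-EC2Snapshot -VolumeId $volume.VolumeId `
        -Description ("Nightly backup {0}" -f (Get-Date -Format 'yyyy-MM-dd'))
}
```

Bear in mind that Initialize-AWSDefaults reads credentials stored per user, so it's worth running the script interactively once under the account the Task runs as.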
24 Mar 2015
My take on Stub Zones vs Conditional Forwarders
I'm lucky/unfortunate enough (delete depending on the day) to architect and manage a geographically distributed Active Directory environment consisting of multiple domains and forests. The forests are connected using a variety of AD trusts and they all hinge on our DNS infrastructure.
I've typically been a big advocate of AD-integrated stub zones over Conditional Forwarders: they can be centrally managed by virtue of their AD integration (with Domain- or Forest-wide scope), and they automatically keep up to date the list of name servers belonging to the source DNS zone.
To be clear, stub zones contain three record types (SOA, NS and A) which reference the name servers responsible for the source DNS zone. Periodically, the SOA, NS and A records are refreshed from the Master Server list for the zone. When a query matching the Stub Zone is performed against the server hosting it, the server uses the NS and then A records in the zone to direct the query to a suitable name server for an answer. This is perfectly acceptable when the source zone is held on a number of Domain Controllers in a central location such as a data centre, but it can add complications when the Domain Controllers are geographically distributed, such as in hub/spoke topologies where the spokes consist of DCs in remote offices connected over small/slow links. While answers to name queries are cached by the server hosting the stub zone, performing lookups across such links may introduce delays or add traffic to the links. Microsoft indirectly acknowledge this behaviour in the TechNet article 'Contrasting stub zones and conditional forwarders', although in the context of security and of not being able to directly influence server-to-server connections, compared with the static configuration of Conditional Forwarders:
Stub zones do not provide the same server-to-server benefit because a DNS server hosting a stub zone in one network will reply to queries for names in the other network with a list of all authoritative DNS servers for the zone with that name, instead of the specific DNS servers you have designated to handle this traffic.
Based on the above statements, my interpretation and my experience, my recommendations are as follows:
Conditional Forwarders - Great for server-to-server name resolution, such as specifically defining that server A will always forward to servers X, Y and Z for contoso.com, when contoso.com is hosted on a heavily geographically distributed AD infrastructure or where not all sites are routable from server A. The obvious downside of Conditional Forwarders is maintaining the list of forwarders on a per-server basis.
Stub Zones - Ideal when referencing DNS zones hosted on resource forests/infrastructures which are centrally hosted and fully routable/reachable from the server hosting the stub zone. The obvious benefits are that the name server list is maintained as part of the stub, and the zone can be AD-integrated to ensure it is available throughout the Domain or Forest. Be careful when creating a stub zone which references a zone hosted on geographically distributed infrastructure.
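For reference, both approaches can be scripted with the DnsServer PowerShell module on Server 2012 or later; contoso.com and the master server IPs below are placeholders:

```powershell
# AD-integrated stub zone for contoso.com, replicated forest-wide
# (zone name and master IPs are placeholders)
Add-DnsServerStubZone -Name "contoso.com" -MasterServers 10.0.0.10 -ReplicationScope "Forest"

# Conditional forwarder for the same zone, maintained on this server only
Add-DnsServerConditionalForwarderZone -Name "contoso.com" -MasterServers 10.0.0.10,10.0.0.11
```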
18 Mar 2015
Pause and Resume Bitlocker Encryption Operation
Maybe a bit of a useless one, but if you find that BitLocker's encryption pass is hindering your progress on a machine, you can control the process using the commands below.
manage-bde -pause <drive letter>:
When you are ready to start encrypting the drive again, type the following command:
manage-bde -resume <drive letter>:
I'm not sure how much value this offers, as with MDT 2012 you can now pre-encrypt drives, negating the need to perform the encryption step after Windows deployment.
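As a worked example on the system drive, including a status check to see how far the encryption pass has progressed:

```
C:\> manage-bde -status C:
C:\> manage-bde -pause C:
  (do your work on the machine)
C:\> manage-bde -resume C:
```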
Lack of Updates - sorry.. but more soon!
I thought it best to apologise for the lack of updates over almost the last two years. Since my last post, I've had a promotion, moved to the other side of the world and found that my workload has greatly increased.
I checked my analytics figures today for the first time since 2013 and can see that the blog is as popular as ever, so that's spurred me on to start putting more effort into posting.
Hopefully I'll have something up soon..
17 Jul 2013
Direct Access 2012 Installation Fun
We've recently upgraded our environment from Windows XP to Windows 7 Enterprise and therefore I thought it worthwhile to see what all the fuss was about regarding DA.
Instead of using UAG and Server 2008 R2, I took a shortcut and went straight for DA with Server 2012. Below are a few of the issues that I experienced and their associated fix.
Network Location Server
It's worth noting, when considering the placement of the Network Location Server, that it's not a good idea to place it on the DirectAccess server itself. DA-configured clients determine whether they are inside the corporate infrastructure by reaching the NLS; when the NLS is unreachable they conclude they are outside and attempt to connect to their DA server, which is also unavailable (it's the same server in this case), putting the machines into a loop and disrupting their connectivity.
Do yourself a favour, build a separate NLS and consider using VMware Fault Tolerance (as an example - there are other virtualisation technologies) to ensure that it's always available in the event of a hardware failure.
When configuring the server, no matter what config I used, the 'Domain' profile was being associated with the external NIC, and the wizard reported 'The adapter configured as external-facing is connected to a domain'. This is a problem caused by the Network Location Awareness functionality within the Operating System, which I could not resolve elegantly. After much Googling I resorted to a Block rule in the Windows Firewall with the following settings:
General: Block the Connection
Scope: 2x External IPs
Programs and Services: Services - Network Location Awareness - NLASVC
Once configured, disable and re-enable the external interface and it should be associated with a public profile.
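The same rule can, in principle, be created with PowerShell on Server 2012. The mapping of the GUI 'Scope' tab to -LocalAddress is my assumption, and the addresses are placeholders for the server's two external IPs:

```powershell
# Block the NLA service on the external interface so the external NIC
# cannot be classified as being on the domain network
# (203.0.113.x are placeholder external IPs - substitute your own)
New-NetFirewallRule -DisplayName "Block NLA on external interface" `
    -Direction Outbound -Action Block `
    -LocalAddress 203.0.113.10,203.0.113.11 `
    -Service NlaSvc
```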
I also hit the following error: 'There is no valid certificate to be used by IPsec which chains to the root/intermediate certificate configured to be used by IPsec in the DirectAccess configuration.' The fix was to allow the DirectAccess server to auto-enrol its own Computer Certificate, even though a Server Authentication cert was already present in its Local Computer certificate store. The Enhanced Key Usage on our Computer Certificate template includes both Server Authentication and Client Authentication; I believe it's the Client Authentication usage that made the difference.
Labels:
DirectAccess,
Server 2012
8 Jan 2013
Adventures when upgrading SCCM 2012 to SP1
I recently upgraded our newly installed SCCM 2012 RTM infrastructure to SP1 after its release in late December. After doing so I encountered a number of issues; below are the problems I experienced and the associated fixes.
Broken MDT Database Connectivity
We have MDT 2012 U1 integrated with our SCCM infrastructure and use the MDAC-based database functionality to look up various details, but after the SP1 upgrade the following errors could be found in the BDD.log:
Unable to create ADODB.Connection object, impossible to query SQL Server: ActiveX component can't create object (429)
After some research and forum posts I managed to confirm that the upgrade process had removed MDAC from the MDT Boot Image, crippling database connectivity inside WinPE. To resolve this, I found it necessary to recreate the MDT Boot Image from within the SCCM Admin Console.
Misassignment of drive letters during OSD of Windows 7
Another side effect of the SP1 upgrade was that previously working Windows 7 images were installing but assigning drive letters D: or E: instead of the normal C: drive assignment. This appears to be a result of the introduction of the new Task Sequence variable OSDPreserveDriveLetter. When investigating my existing Task Sequences I found that a new step had been added named 'Set Variable for Drive Letter', which declares the value of this variable as False. Changing this value to True ensures that the intended drive letter assignment is honoured and that future OS drives are assigned C:.
Labels:
MDT 2012,
SCCM 2012,
SCCM 2012 SP1
21 Nov 2012
Offline USMT excluding Local Accounts in SCCM 2012 & MDT 2012
Local Machine accounts can be excluded from the USMT process by specifying an additional Task Sequence variable and then either explicitly excluding the local account(s) or explicitly including the domain accounts. An issue with the offline USMT method is that the machine is unable to determine the domain by its name, so it's necessary to use the Domain's SID instead, based on the fact that each user account within a given domain shares the same SID apart from the RID, which forms the last section of the SID string.
See the Wikipedia explanation
- First, create the necessary TS variable in a new step before the Scanstate operation, named OSDMigrateAdditionalCaptureOptions
- Give it a value of /ue:* /ui:<Domain SID>*
A domain's SID can be retrieved using PsGetSid.exe from Sysinternals with the following syntax: psgetsid.exe <Domain Name>
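Putting it together with a made-up domain SID (yours will differ):

```
C:\> psgetsid.exe CONTOSO
SID for CONTOSO:
S-1-5-21-1004336348-1177238915-682003330

OSDMigrateAdditionalCaptureOptions = /ue:* /ui:S-1-5-21-1004336348-1177238915-682003330*
```

The /ue:* excludes every user, and the /ui: pattern then re-includes any account whose SID starts with the domain SID, i.e. every domain account.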