Fetch DNS Scavenged Records with the Help of PowerShell

After enabling DNS Scavenging successfully in our infrastructure, it was a huge pain to find out which DNS records had actually been scavenged. So I wrote a script to gather all the information about scavenging and all scavenged records, build a report, and email it to me every Monday morning, because Sunday is our scavenging day.

I am going to explain what this script does for you and how it can be useful in your work.

Whenever DNS scavenging happens, it generates one of two events: 2501 or 2502.
Event ID 2501 is generated when DNS scavenging runs automatically, and the script I created mainly uses this event because our scavenging is configured to run automatically.
Event ID 2502 contains the same information as 2501, but it is generated when you run scavenging manually.

You will find both of these events under Event Viewer > Applications and Services Logs > DNS Server.
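
Here is a minimal sketch of the kind of collection script I am describing, assuming the DNS server is the local machine and that the SMTP server and mail addresses are placeholders you would replace with your own:

```powershell
# Collect the 2501 (automatic scavenging) events from the last 7 days.
$events = Get-WinEvent -FilterHashtable @{
    LogName   = 'DNS Server'
    Id        = 2501
    StartTime = (Get-Date).AddDays(-7)
} -ErrorAction SilentlyContinue

# Build a simple text report from the event timestamps and messages.
$body = $events |
    Select-Object TimeCreated, Message |
    Format-List | Out-String

# Placeholder mail settings - replace with your own SMTP server and addresses.
Send-MailMessage -SmtpServer 'smtp.contoso.com' -From 'dns-reports@contoso.com' `
    -To 'admin@contoso.com' -Subject 'Weekly DNS scavenging report' -Body $body
```

Schedule it as a task for Monday morning and the weekly report arrives automatically.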

Domain Controller System State Backup (Server 2016)

I recently scheduled a domain controller (PDC) system state backup to run every first Sunday of the month. I wrote a script using the new PowerShell Windows backup cmdlets, which are much nicer to use than the old legacy command-prompt utilities.

You can read the script in the screenshot attached below. If you want to download it, you will find the link below the image.

Note: Make sure you install the Windows Server Backup feature first, otherwise these cmdlets will not work.
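
Below is a hedged sketch of a system state backup using the Windows Server Backup cmdlets; the network target path is a placeholder. Because the built-in backup scheduler cannot express "first Sunday of the month", the usual approach is to run a script like this from a scheduled task with a monthly trigger:

```powershell
# Requires the Windows Server Backup feature (Install-WindowsFeature Windows-Server-Backup).
Import-Module WindowsServerBackup

$policy = New-WBPolicy
Add-WBSystemState -Policy $policy                       # back up system state only
$target = New-WBBackupTarget -NetworkPath '\\BackupSrv\DCBackups'   # placeholder share
Add-WBBackupTarget -Policy $policy -Target $target
Start-WBBackup -Policy $policy                          # run the backup once
```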

Change DNS Dynamic Records to Static Records

It often happens that we get a request from another team asking us to convert their servers' DNS records from dynamic to static. That is fine for 5-10 DNS records, but what about 1000? I got exactly that request and was dreading converting all 1000 records manually.

So I created a small script to do the work, and I want to share it in case it helps you as well.
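
As a rough illustration of the approach (not my exact script), here is a minimal sketch assuming the DnsServer module, a hypothetical zone name, and a CSV file with a Name column listing the host records to convert. Dynamic records carry an aging timestamp, so re-creating a record with Add-DnsServerResourceRecordA leaves it static:

```powershell
# Placeholder zone and input file - replace with your own.
$zone = 'corp.contoso.com'

Import-Csv 'C:\Temp\servers.csv' | ForEach-Object {
    # Assumes one A record per host name.
    $record = Get-DnsServerResourceRecord -ZoneName $zone -Name $_.Name -RRType A
    if ($record -and $record.Timestamp) {               # only touch dynamic records
        $ip = $record.RecordData.IPv4Address.IPAddressToString
        Remove-DnsServerResourceRecord -ZoneName $zone -Name $_.Name -RRType A -Force
        Add-DnsServerResourceRecordA -ZoneName $zone -Name $_.Name -IPv4Address $ip
    }
}
```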

DHCP Security (Recommendations)

I decided to write this article about the DHCP security features I recommend to all admins who are responsible for Domain Controllers, DHCP, DNS, NPS, PKI, and so on. What I have observed in my career so far is that many administrators do not configure all of the DHCP security settings. Below are some examples I have experienced myself.

1. We had a host record "WATCH" pointing to the IP address of a server. A user came in with his home MacBook and plugged an office LAN cable into it. As bad luck would have it, his MacBook was also named "WATCH", and the record was overwritten with the MacBook's IP address.

2. We have a Wi-Fi scope for visitors. One day someone came in for an interview and connected his iPad to our visitor Wi-Fi. Suddenly our Exchange team emailed me that their "SPAM" server logs had stopped working and asked me to fix it. When I checked, they had a DNS A record named SPAM, and it had been overwritten with a visitor Wi-Fi IP address. Digging further, we found that the interview candidate's iPad was named SPAM.

3. In my first organisation, there was a problem with duplicate host records, and most teams, such as security, Exchange, and SCCM, suffered badly because of it. If you wanted to deploy something to system A, it would land on system D; if you needed to run a script remotely on system D, it would run on system G. A total mess.

Here are the settings I recommend for every admin (a configuration sketch follows this list):
A. Let DHCP own the DNS records.
B. Enable Name Protection.
C. Disable DNS record registration for certain scopes (such as guest Wi-Fi).
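
A hedged sketch of how these three settings can be applied with the DhcpServer cmdlets; the scope IDs and the service account name are placeholders, not values from my environment:

```powershell
# A. Let the DHCP server own DNS records by registering them with a dedicated
#    low-privileged account instead of the server's computer account.
Set-DhcpServerDnsCredential -Credential (Get-Credential 'CONTOSO\svc-dhcp-dns')

# B. Turn on Name Protection so an existing DNS record cannot be overwritten
#    by a different machine registering the same name.
Set-DhcpServerv4DnsSetting -ScopeId 10.0.10.0 -NameProtection $true

# C. Disable dynamic DNS registration entirely for guest/visitor scopes.
Set-DhcpServerv4DnsSetting -ScopeId 10.0.99.0 -DynamicUpdates Never
```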

Delegating DHCP Server Administration

Although it is easy to delegate DHCP server administration tasks, some admins still get confused because they treat DHCP delegation like AD delegation, or they don't know how to give DHCP delegation access to other users.

It is important to know that you cannot assign DHCP administration and monitoring privileges directly to individual user accounts on the server.

So the question is, how do you do it? Whenever you install and configure the DHCP server role, it creates two security groups by default: "DHCP Administrators" and "DHCP Users".
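
Adding users to these groups is then a one-liner. A hedged sketch (account names are placeholders): on a domain controller running DHCP the groups are domain local groups, so the AD cmdlets work; on a standalone member server they are local groups and Add-LocalGroupMember would be used instead.

```powershell
# Full administration of the DHCP service.
Add-ADGroupMember -Identity 'DHCP Administrators' -Members 'jsmith'

# Read-only/monitoring access to the DHCP service.
Add-ADGroupMember -Identity 'DHCP Users' -Members 'helpdesk-team'
```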

SMB Insecurely Configured Service Vulnerability

This vulnerability can be caused by many services that use SMB in some way. There are many related articles you can find that will tell you which service has the problem and what the fix should be.

Articles: Tenable, Nessus, Microsoft

When I worked on this security incident, I found that some policies were wrongly configured in a GPO applied to all laptops and workstations.

There are only four things you have to check in the service permissions, whether they come from GPO or were configured manually. You then have to remove these principals from the ACLs of those services:

1. Authenticated Users
2. Domain Users
3. Users
4. Everyone

If a service is disabled, there is no need to check it. If it is enabled, whether in Automatic or Manual mode, it is important to check.
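
As a rough way to find candidates, here is a hedged sketch that dumps the security descriptor (SDDL) of every enabled service with sc.exe sdshow and flags ACEs granting access to those broad principals; the SDDL abbreviations AU, WD, BU, and DU correspond to Authenticated Users, Everyone, Builtin\Users, and Domain Users:

```powershell
# Flag enabled services whose ACL grants access to broad principals.
# AU = Authenticated Users, WD = Everyone, BU = Builtin\Users, DU = Domain Users.
$broadSids = 'AU', 'WD', 'BU', 'DU'

Get-Service | Where-Object { $_.StartType -ne 'Disabled' } | ForEach-Object {
    $sddl = (sc.exe sdshow $_.Name) -join ''
    foreach ($sid in $broadSids) {
        if ($sddl -match "\(A;[^)]*;$sid\)") {
            [pscustomobject]@{ Service = $_.Name; Principal = $sid }
        }
    }
}
```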

RoboCopy Of Data Excluding Multiple System Directories

Sometimes we get a requirement to copy a complete volume to another file server and add it to DFSN and DFSR.

When you create the root folder on the new server's volume, and the source volume has lots of NTFS permissions, it gets difficult to copy the exact ACLs across. And if we robocopy the entire volume, we end up with system folders we don't want on the new volume, like $RECYCLE.BIN, System Volume Information, and DfsrPrivate. Cleaning up these folders afterwards is also time consuming.

You can use the /XD switch in the robocopy command to exclude all these system folders, which reduces your workload while copying the data, like below:
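
For example, a hedged sketch (source drive, destination share, and log path are placeholders):

```powershell
# Hypothetical example: D:\ is the source volume and \\FS02\D$ the destination.
# /MIR mirrors the tree, /COPYALL copies data plus NTFS ACLs/owner/auditing,
# and /XD excludes the system directories we do not want on the new volume.
robocopy D:\ \\FS02\D$ /MIR /COPYALL /R:1 /W:1 /LOG:C:\Temp\robocopy.log `
    /XD '$RECYCLE.BIN' 'System Volume Information' 'DfsrPrivate'
```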

Re-Apply Configuration Profiles

It often happens that users who have administrator access on their macOS systems remove the MDM profile, which causes all the other profiles to be removed from the system as well.

If you have worked with Profile Manager, the JAMF MDM profile is the equivalent of the Profile Manager enrollment profile. Once it is pushed to a macOS system, all the other policies available for that system will be pushed too.

Now you know why the MDM profile is necessary. So we have to find a way either to restrict its removal, or to create an ongoing policy that checks for the MDM profile every time and re-enrolls the system if it is missing.

Change Primary Member On DFS Replication Group

First, you need to know which member server is acting as the primary member of the replication group, i.e. the server that the other members will treat as having the authoritative data to replicate.
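
A quick way to check this, and to change it, is with the DFSR cmdlets (Windows Server 2012 R2 and later). The replication group, folder, and server names below are placeholders:

```powershell
# Find which member is currently flagged as primary.
Get-DfsrMembership -GroupName 'RG-Data' |
    Select-Object ComputerName, FolderName, PrimaryMember

# Flag FS02 as the primary member (authoritative for initial replication).
Set-DfsrMembership -GroupName 'RG-Data' -FolderName 'Data' `
    -ComputerName 'FS02' -PrimaryMember $true -Force
```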

Count Files by Extension Using PowerShell

I found this beautiful piece of code somewhere on the internet that gives a count of all files grouped by their extensions. I mainly needed it while migrating on-premises file services to the cloud, so I thought I would share it on my blog to spread it further.
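
A sketch along the same lines (the root path is a placeholder):

```powershell
# Group every file under the path by extension and count them, largest first.
Get-ChildItem -Path 'D:\Shares' -Recurse -File -ErrorAction SilentlyContinue |
    Group-Object -Property Extension |
    Sort-Object -Property Count -Descending |
    Select-Object @{Name = 'Extension'; Expression = { $_.Name }}, Count
```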

Azure Active Directory Domain Services Capabilities and Limitations.

Here are the capabilities and limitations of "Azure Active Directory Domain Services" that you need to consider while making a decision about Active Directory in the cloud.

Managed service
Azure AD Domain Services domains are managed by Microsoft. You do not have to worry about patching, updates, monitoring, backups, and ensuring availability of your domain. These management tasks are offered as a service by Microsoft Azure for your managed domains.


Secure deployments
The managed domain is securely locked down as per Microsoft’s security best practices for AD deployments. These best practices stem from the AD product team's decades of experience engineering and supporting AD deployments. For do-it-yourself deployments, you need to take specific deployment steps to lock down/secure your deployment.

Cloud Based Enterprise Directory [ Microsoft ]

Until now, the only way to put a fully capable Active Directory in the cloud has been to create an Azure VM in Azure IaaS and configure it as a domain controller. But that requires a different set of 'cloud credentials' to log in to and administer the VMs in the cloud, and it is limited to those VMs. To take it a step further, you can configure an AD trust relationship with your on-premises AD environment over a VPN/ExpressRoute connection. Then you can join the virtual machines to your domain, and user authentication will happen over the VPN/ExpressRoute connection to your on-premises directory.

There are only a few benefits in doing this:

1. You extend your Active Directory to the cloud.
2. The on-premises Active Directory replicates to the Azure VM domain controller over the VPN/ExpressRoute connection.
3. You can join your Azure VMs to the domain and manage them.
4. You can use every capability of Active Directory in the cloud, such as domain join, Group Policy, LDAP bind/read/write, and Kerberos/NTLM authentication.

DFSR: Limiting the Number of Imported Replicated Folders when using DB cloning

Basic: Export a Clone of the DFS Replication Database

Before reading this article, you should read the article linked above to get a basic understanding of DFS Replication database cloning. If you have any questions about it, please let me know, but make sure your doubts are cleared before going further.

Cloning the database can cause some problems when you have multiple replicated folders on one volume. Let's say you have a volume named "Department" with replicated folders Finance, Sales, and Marketing, and you are going to create one more replicated folder named "HR". I hope this small example makes the idea of multiple replicated folders on a single volume clear.

Here is a scenario that builds on the example above. You have three servers in your organisation: Server1, Server2, and Server3. The replicated folders "Finance", "Sales", "Marketing", and "HR" already exist on Server1 and Server2. Server3 also has "Finance", "Sales", and "Marketing", in sync with Server1 and Server2, but you now need to add "HR" to Server3. This means you will first robocopy the entire "HR" folder from Server1/Server2 to Server3 and then import the replication database.

As per my first article about database cloning, I will export the replication database from Server1, since all the replicated folders on that volume are in a Normal state. There is not much to do during the export, and I will end up with a complete clone of the replication database plus a configuration XML file from the server. In short, the replication database I get after running the command belongs to all the replicated folders on that volume. But I only want to work on the "HR" folder, and I don't want the other replicated folders to be affected.

I apologize for making this a little lengthy; I just want the situation and scenario to be clear. Now here is the solution.

Microsoft Removes the 260-Character Limit for NTFS Paths in New Windows 10

The maximum length of a path (file name plus its directory route), also known as MAX_PATH, has historically been defined as 260 characters. But with the latest Windows 10, Microsoft is giving users the ability to lift that limit.
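
On Windows 10 1607 and later this is an opt-in setting; here is a sketch of how it is typically enabled. The same setting is also exposed as the "Enable Win32 long paths" policy under Computer Configuration > Administrative Templates > System > Filesystem.

```powershell
# Enable NTFS long path support (requires Windows 10 1607+ and applications
# that declare long-path awareness in their manifest).
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem' `
    -Name 'LongPathsEnabled' -Value 1 -Type DWord
```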

Export a Clone of the DFS Replication Database

Here I am going to explain the new DFSR feature "Database Cloning", which was introduced in Windows Server 2012 R2.

Whenever we add a new member to a replication group for the first time, replace server hardware, or recover from loss or corruption of the DFS Replication database, the initial synchronisation takes a long time to complete before file and folder replication starts. With this feature, we can reduce the initial synchronisation time by up to 99% under ideal circumstances.

“Source server”: the server from which you will export the DFS Replication database.
“Destination server”: the server onto which you will import the DFS Replication database.
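
The core of the workflow looks roughly like this (server roles as defined above; the volume letter and clone path are placeholders):

```powershell
# A minimal sketch of DFSR database cloning (Windows Server 2012 R2+).

# On the source server: export the DFSR database and config XML for volume D:
Export-DfsrClone -Volume 'D:' -Path 'D:\DfsrClone'

# Pre-seed the replicated data and copy the exported clone files to the
# destination server (for example with robocopy), then on the destination:
Import-DfsrClone -Volume 'D:' -Path 'D:\DfsrClone'
```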

Changes to Keychains in macOS Sierra

This article relates to macOS Sierra 10.12.1 and 10.12.2, because I am still working through the 10.12.3 beta update to find all the keychain changes.
I recently upgraded my MacBook from El Capitan to Sierra and found that my keychain update script had stopped working, which really confused me about what had changed.
My script had a function in which I specified the manual path of the login keychain with a $user variable, like:
/Users/$user/Library/Keychains/login.keychain
But after upgrading to macOS Sierra, the script stopped working because Apple changed the login keychain file name:
/Users/$user/Library/Keychains/login.keychain-db
This change causes some issues right now: old scripts with the hard-coded path will not work, and that can lead to AD account lockout issues.