Hammer of God Utilities (Terminal Services and other useful utils)
Hammer of God Utilities from Tim Mullen's (Thor@hammerofgod.com) site:

UserInfo v1.5 (~ 41k - author: Thor)
UserInfo is a little function that retrieves all available information about any known user from any NT/Win2k system that you can hit 139 on. Specifically calling the NetUserGetInfo API call at Level 3, UserInfo returns standard info like SID, primary group, logon restrictions, etc., but it also dumps special group information, pw expiration info, pw age, smartcard requirements, and lots of other stuff. This guy works as a null user, even if the system has RestrictAnonymous (RA) set to 1 to specifically deny anonymous enumeration.
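For reference, a minimal sketch of that call (not UserInfo's actual source; the \\TARGET server and account name are placeholders, and it assumes you link netapi32.lib):

/* Minimal sketch of the NetUserGetInfo call UserInfo is built around.
   Server and user names below are placeholders. */
#include <windows.h>
#include <lm.h>
#include <stdio.h>

#pragma comment(lib, "netapi32.lib")

int wmain(void)
{
    LPCWSTR server = L"\\\\TARGET";    /* placeholder target */
    LPCWSTR user   = L"Administrator";
    PUSER_INFO_3 ui = NULL;

    /* Level 3 returns the full USER_INFO_3 structure: RID, primary
       group, password age, logon restrictions, flags, etc. */
    NET_API_STATUS rc = NetUserGetInfo(server, user, 3, (LPBYTE *)&ui);
    if (rc != NERR_Success) {
        wprintf(L"NetUserGetInfo failed: %lu\n", rc);
        return 1;
    }

    wprintf(L"Name:          %s\n", ui->usri3_name);
    wprintf(L"RID:           %lu\n", ui->usri3_user_id);
    wprintf(L"Primary group: %lu\n", ui->usri3_primary_group_id);
    wprintf(L"PW age (s):    %lu\n", ui->usri3_password_age);
    wprintf(L"Flags:         0x%08lx\n", ui->usri3_flags);

    NetApiBufferFree(ui);
    return 0;
}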
UserDump v1.11 (~ 42k - author: Thor)
UserDump is UserInfo with a twist. It combines LookupAccountSid and LookupAccountName with UserInfo's NetUserGetInfo calls, resulting in a SID walker that can dump every user in a domain in a single command line. It gives you all the information that UserInfo does, but it lets you specify the number of users you want to walk. Pretty cool. Also runs as a null user, even with RA set to 1.
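The SID-walking idea reduces to: look up the domain SID once, then tack candidate RIDs onto it and resolve each one. A sketch under those assumptions (not UserDump's source; server, domain, and RID range are placeholders):

/* Sketch of the SID-walking idea behind UserDump: resolve the domain
   SID with LookupAccountName, then append candidate RIDs and resolve
   each with LookupAccountSid. */
#include <windows.h>
#include <stdio.h>

#pragma comment(lib, "advapi32.lib")

int wmain(void)
{
    LPCWSTR server = L"\\\\TARGET";  /* placeholder */
    LPCWSTR domain = L"MYDOMAIN";    /* placeholder */

    BYTE domSid[SECURITY_MAX_SID_SIZE];
    DWORD cbSid = sizeof(domSid);
    WCHAR refDom[256]; DWORD cchDom = 256;
    SID_NAME_USE use;

    /* Looking up the domain name itself returns the domain SID. */
    if (!LookupAccountNameW(server, domain, domSid, &cbSid,
                            refDom, &cchDom, &use))
        return 1;

    for (DWORD rid = 500; rid < 520; rid++) {   /* walk a RID range */
        BYTE acctSid[SECURITY_MAX_SID_SIZE];
        CopySid(sizeof(acctSid), acctSid, domSid);

        /* Append the RID as one more subauthority. */
        UCHAR *count = GetSidSubAuthorityCount(acctSid);
        (*count)++;
        *GetSidSubAuthority(acctSid, *count - 1) = rid;

        WCHAR name[256], dom[256];
        DWORD cchName = 256, cchDom2 = 256;
        if (LookupAccountSidW(server, acctSid, name, &cchName,
                              dom, &cchDom2, &use))
            wprintf(L"%lu: %s\\%s\n", rid, dom, name);
    }
    return 0;
}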
ProbeTS v1.0 (~ 33k - author: Thor)
Seeing Erik Birkholz and Clinton Mugge easily redirect terminal server requests to different ports at Blackhat scared me. If you change the default Terminal Server port from 3389 to something else, it is basically undetectable unless you physically try each port. This means that one of your admin people could easily hide a Terminal Server somewhere on your network that you would have no way of finding... and that ain't good. ProbeTS gives you a leg up. Though it takes a back-door approach, ProbeTS will scan a full C-class for you to determine if terminal services are being offered up regardless of what port is actually being used. There is no magic here... You have to be able to hit the boxes with RPC, and you have to be an authenticated TS user on the target machine. This would typically limit its use to Domain Admins, but it is more than you had to begin with. Don't expect it to be too fast either: what you gain in being able to identify an any-port terminal server, you give up in speed. Specifically, it loops through your C-class, and asks every IP address for a terminal server handle. If it gets one, it knows it is a TServer. Simple, but effective.
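ProbeTS's exact internals aren't published here, but the documented WTS API lets you sketch the same ask-every-IP-for-a-handle loop. Assumptions: wtsapi32.lib, a placeholder 192.168.1.x C-class, and the RPC/TS-user access noted above:

/* Sketch of the probing idea: ask each host in a C-class for a
   Terminal Server session list via the WTS API. */
#include <windows.h>
#include <wtsapi32.h>
#include <stdio.h>

#pragma comment(lib, "wtsapi32.lib")

int wmain(void)
{
    for (int host = 1; host < 255; host++) {
        WCHAR ip[32];
        swprintf(ip, 32, L"192.168.1.%d", host);  /* placeholder C-class */

        /* WTSOpenServer always hands back a handle; no traffic goes out
           until the first call against it, so the enumeration below is
           the real test. */
        HANDLE h = WTSOpenServerW(ip);
        PWTS_SESSION_INFOW si = NULL;
        DWORD count = 0;

        if (WTSEnumerateSessionsW(h, 0, 1, &si, &count)) {
            wprintf(L"%s is offering terminal services (%lu sessions)\n",
                    ip, count);
            WTSFreeMemory(si);
        }
        WTSCloseServer(h);
    }
    return 0;
}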
TSEnum Beta v0.91 (~ 33k - author: Thor)
This guy goes about things a little differently than ProbeTS does. There is certainly a place for ProbeTS in the LAN, but TSEnum has proven to be a bit more powerful- that is, if I can figure out why I am getting different results when I run it. Feel free to email me with any idiosyncrasies in its operation.
TSEnum (Terminal Server Enum) is actually a lot more than that- it is an EVERYTHING enumeration tool. Again, my goal was to find a good way to quickly scan the network for rogue Terminal Servers a la Erik and Clinton's hiding techniques. When a server/workstation joins the domain, it registers itself with the master browser. Part of this registration includes the server type, which can be retrieved via the NetServerEnum function. This is basically a remote API call that gets the target box to query its master browser for everything that it can see, and asks it to dump it all back to you. Cool stuff. What I need help with is the testing in different environments. I have been able to successfully enumerate all the servers in other domains with no credentials, and without having to do an anonymous net use first... But sometimes it errors out on me, even when it worked previously. Go figure. So give it a shot and let me know what you come up with. Thanks!
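A minimal sketch of that NetServerEnum call, filtered to Terminal Servers (not TSEnum's source; the target box and domain are placeholders):

/* Ask a target box to dump what its master browser can see, limited
   here to machines that registered as Terminal Servers. */
#include <windows.h>
#include <lm.h>
#include <stdio.h>

#pragma comment(lib, "netapi32.lib")

int wmain(void)
{
    PSERVER_INFO_101 si = NULL;
    DWORD read = 0, total = 0;

    /* SV_TYPE_TERMINALSERVER filters the dump; SV_TYPE_ALL would
       return everything the browser knows about. */
    NET_API_STATUS rc = NetServerEnum(
        L"\\\\TARGET",            /* placeholder: box to query */
        101, (LPBYTE *)&si, MAX_PREFERRED_LENGTH,
        &read, &total,
        SV_TYPE_TERMINALSERVER,
        L"MYDOMAIN",              /* placeholder domain */
        NULL);

    if (rc == NERR_Success || rc == ERROR_MORE_DATA) {
        for (DWORD i = 0; i < read; i++)
            wprintf(L"%s  (type 0x%08lx)\n",
                    si[i].sv101_name, si[i].sv101_type);
        NetApiBufferFree(si);
    } else {
        wprintf(L"NetServerEnum failed: %lu\n", rc);
    }
    return 0;
}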
TransportEnum v1.0 (~ 33k - author: Thor)
When I was doing research for my RestrictAnonymous stuff, I basically went through lots of different Net API calls to see what I could do as an anonymous user. I was particularly interested in calls that could be made as NULL even when RA was set to 1. The NetServerTransportEnum call is one such call: it supports a SERVER_TRANSPORT_INFO_0-level structure return in such circumstances. It basically allows you to get the transport names (devices) in use on a box. With NT4, the protocol name usually contains the adapter type as well as the protocol, so it was pretty easy to see stuff like modems, net cards, etc., in a dump... i.e., a box running TCP/IP on an Intel card would dump something like:

Transport: \Device\NetBT_E100B1
Address: 00a0c9740202
This was really useful to enumerate all the transport information, modems included, on a box/domain. However, in Win2k/XP, the transport name has changed to a Unicode character string that contains the device name, and what looks to be a CSID or something, as in:

Transport: \Device\NetBT_Tcpip_{CE081110-126E-4BD1-88B0-2FF8C1D83D10}
Address: 00c0f06cdf7a

You still get the protocol name, but it is hard (for me, without doing other research) to see if the device is a modem or not without finding out what that CSID is. So hopefully someone out there will come across this and have the time to contribute to the tool in regard to mapping out what the CSID means. Please let me know if you find anything interesting.
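For anyone who wants to poke at this, a minimal sketch of the level-0 call itself (not TransportEnum's source; the target name is a placeholder):

/* Dump transport names and addresses via NetServerTransportEnum at
   level 0, the call described above. */
#include <windows.h>
#include <lm.h>
#include <stdio.h>

#pragma comment(lib, "netapi32.lib")

int wmain(void)
{
    PSERVER_TRANSPORT_INFO_0 ti = NULL;
    DWORD read = 0, total = 0;

    NET_API_STATUS rc = NetServerTransportEnum(
        L"\\\\TARGET",            /* placeholder target */
        0, (LPBYTE *)&ti,
        MAX_PREFERRED_LENGTH, &read, &total, NULL);

    if (rc == NERR_Success) {
        for (DWORD i = 0; i < read; i++) {
            wprintf(L"Transport: %s\nAddress:   ",
                    ti[i].svti0_transportname);
            /* The address comes back as raw bytes; print it as hex
               like the dumps shown above. */
            for (DWORD j = 0; j < ti[i].svti0_transportaddresslength; j++)
                wprintf(L"%02x", ti[i].svti0_transportaddress[j]);
            wprintf(L"\n");
        }
        NetApiBufferFree(ti);
    }
    return 0;
}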
        
      
TSGrinder (About Damn Time!)
TSGrinder is the first production Terminal Server brute-force tool, and is now in release 2. The main idea here is that the Administrator account, since it cannot be locked out for local logons, can be brute forced. And having an encrypted channel to the TS logon process sure helps to keep IDS from catching the attempts.
TSGrinder is a "dictionary" based attack tool, but it does have some interesting features like "l337" conversion, and supports multiple attack windows from a single dictionary file. It supports multiple password attempts in the same connection, and allows you to specify how many times to try a username/password combination within a particular connection.
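To give a feel for what "l337" conversion does to a dictionary word, here is a guessed-at substitution table (TSGrinder's actual table and combination logic are not published in this post):

/* Rough idea of "l337" conversion: take each dictionary word and also
   try character-substituted variants. The table below is a guess for
   illustration only. */
#include <stdio.h>
#include <string.h>

static char leet(char c)
{
    switch (c) {
    case 'a': return '4';
    case 'e': return '3';
    case 'i': return '1';
    case 'l': return '1';
    case 'o': return '0';
    case 's': return '5';
    case 't': return '7';
    default:  return c;
    }
}

int main(void)
{
    const char *words[] = { "password", "letmein", "hammer" };
    char buf[64];

    for (size_t i = 0; i < sizeof(words) / sizeof(words[0]); i++) {
        strncpy(buf, words[i], sizeof(buf) - 1);
        buf[sizeof(buf) - 1] = '\0';
        for (size_t j = 0; buf[j]; j++)
            buf[j] = leet(buf[j]);
        /* Try both the raw word and its substituted variant. */
        printf("%s -> %s\n", words[i], buf);
    }
    return 0;
}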
Note that the tool requires the Microsoft Simulated Terminal Server Client tool, "roboclient," which may be found here:

ftp://ftp.microsoft.com/ResKit/win2000/roboclient.zip
There are still a couple of bugs we are working out- for instance, we've got a problem with using "l337" conversion with more than 2 threads open. There have also been requests to support standard brute-force-via-character-iteration attacks, and we will get to this when we can. In the meantime, enjoy the tool, and let me know how it works for you.
For those interested in the Blackhat presentation Ryan Russell and I made in Vegas, you can find that here:

http://www.blackhat.com/presentations/bh-usa-03/bh-us-03-mullen.pdf
Go nuts!
 
SQueaL v1.0
SQueaL is my new rogue SQL2000 server impersonator written under Linux using DilDog's most excellent TalkNTLM C++ code (the Telnet Server exploit) as a basis. Though the packet structures and NTLM negotiation between an MS client and SQL2000 are completely different from standard NTLM authentication, and most parts of the code had to be completely rewritten for this to work, I must give credit to DilDog for making his code available. DBNETLIB supports NTLM authentication, and as shown in my presentations at Blackhat and Defcon (though my demo hosed up on me! Damn White Russians!) you can 'force' an MS client with DBNETLIB loaded (and guess what, it is on XP by default <bfg>) to authenticate to you over port 1433. This guy will wait for a connection, negotiate NTLM authentication, and parse out the plain-text username and domain, along with the NTLM response hash for your cracking pleasure. At some point, if I don't run out of Vodka, I'll try to duplicate SMBRelay-esque functionality to use the response to authenticate back to the client if 139/445 is open. I don't know how to do that yet, so I may be full of crap.
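For the curious, the parsing step works against the standard NTLMSSP Type 3 (authenticate) layout; a rough sketch follows (this is not SQueaL's source, the struct assumes a little-endian box, and real code has to honor the negotiated Unicode flag when decoding the string fields):

/* Minimal sketch of pulling domain/user/NTLM-response out of an
   NTLMSSP Type 3 (authenticate) message. Offsets follow the published
   NTLMSSP layout. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#pragma pack(push, 1)
typedef struct {          /* NTLMSSP "security buffer" */
    uint16_t len;
    uint16_t maxlen;
    uint32_t offset;      /* from start of the message */
} secbuf;

typedef struct {
    char     sig[8];      /* "NTLMSSP\0" */
    uint32_t type;        /* 3 = authenticate */
    secbuf   lm_resp;
    secbuf   ntlm_resp;   /* the hash you want to crack */
    secbuf   domain;
    secbuf   user;
    secbuf   workstation;
} ntlm_type3;
#pragma pack(pop)

static void dump(const uint8_t *msg, size_t msglen, const secbuf *b,
                 const char *label)
{
    if ((size_t)b->offset + b->len > msglen) return;  /* bounds check */
    printf("%s: ", label);
    for (uint16_t i = 0; i < b->len; i++)
        printf("%02x", msg[b->offset + i]);
    printf("\n");
}

int parse_type3(const uint8_t *msg, size_t msglen)
{
    const ntlm_type3 *t3 = (const ntlm_type3 *)msg;
    if (msglen < sizeof(*t3) || memcmp(t3->sig, "NTLMSSP", 8) != 0 ||
        t3->type != 3)
        return -1;        /* not an authenticate message */

    dump(msg, msglen, &t3->domain,    "domain (raw)");
    dump(msg, msglen, &t3->user,      "user (raw)");
    dump(msg, msglen, &t3->ntlm_resp, "NTLM response");
    return 0;
}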
* 3/7/02 OK- Here it is. I have had mixed results in different environments with the tool. Sometimes it works, sometimes it doesn't. Here is the source, so you can all hack away at it and see how well you do. Please let me know if you find something I need to know or if you have any ideas. Thanks.
Blaine Kubesh reported on a Security Focus list the fact that Nimda's execution relied upon a named mutex (MUTually EXclusive object) to run. Running a program that creates this named mutex first causes Nimda's load to fail (reportedly- I actually do not have the means to test it). Here it is; it is a simple console program that creates the named mutex and keeps running until you hit 'q' to close the handle and exit the program (leaving you exposed again). Note that this is ver 0.02, which fixed a problem where I named the handle, but did not actually name the mutex (Doh! Thanks to Jason Anderssen for bringing that to my attention.) Source code included (what little there is.)
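The whole trick fits in a few lines; a sketch of the described behavior (the mutex name below is a placeholder, since this post does not spell out the name Blaine reported):

/* Create the worm's named mutex first, so the worm's own CreateMutex
   finds it already taken. "NimdaMutexName" is a placeholder; the real
   tool names the mutex the worm actually checks. */
#include <windows.h>
#include <stdio.h>
#include <conio.h>

int main(void)
{
    HANDLE h = CreateMutexA(NULL, FALSE, "NimdaMutexName");
    if (!h) {
        printf("CreateMutex failed: %lu\n", GetLastError());
        return 1;
    }
    printf("Mutex held. Press 'q' to release and exit.\n");
    while (_getch() != 'q')
        ;                       /* keep the handle open */
    CloseHandle(h);             /* exiting leaves you exposed again */
    return 0;
}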
URLScan DTS package v0.01 (~ 58k - author: Thor)
Microsoft's URLScan utility for IIS is great, but the urlscan.log file is pretty basic: URLScan only lets you log to a text file in a simple one-after-another appended manner, and only to a single file (no multiple files to break out week/month entries as IIS does).
        
This makes monitoring the entries in the log file difficult, which is a shame because it is good to see what attempted URLs get filtered for incident response. To help with this, I have created a DTS package that runs on SQL2000 to automatically do the following:
        
1) FTP the urlscan.log file to a temp dir on the SQL Server (this way, you don't have to stop IIS).
2) Parse out the date, time, IP address (if available), and the URL that was filtered, and post them to a temp table.
3) Select only the entries for the previous day, and post those to the warehouse table.
        
The urlscan.log file just keeps getting bigger and bigger, so at some point you'll want to stop IIS and delete that guy. The nice thing about loading it into a temp table first is that you can ensure that only the day-by-day entries get posted into the warehouse table.
        
For each server you want to pull data from, just add another package and schedule execution appropriately- the current setup is designed for sequential downloads from multiple servers; in other words, you should only do one at a time, and give adequate time between schedules for each to execute. Make sure you execute them after midnight so that all the entries for the day will be included.
        
If you want to pull from multiple servers asynchronously, you can customize each package to use a different temp file name and then run them all at the same time... Whatever you want to do. I am making the assumption that you have some idea of how to use and configure DTS packages.
        
You only need to do 2 things to the package for it to work with your server:
1) Change the properties of the FTP task to point to your web server, and make sure you select the urlscan.log file to get downloaded. The default dir on the server is C:\TEMP.
2) In the last SQL Task that posts data from the tmpURLScan table to the real URLScan table, change the text string from 'SERVER' to the name of your server.
Note that this assumes you have a DB named IISLogs; you can change it to whatever you want, but know that you need to check for that in the data pump tasks.
        
This way, you end up with a great way to sort, retrieve, group, and report on data in the log files.
        
A note about FTP server setup:
On the web server that you want to pull data from, just create a new FTP site that points to the directory containing the urlscan.log file. For security purposes, it should be read-only, and limited to your internal IP addresses. I created a new user specifically for this purpose that only has read permissions to the URLSCAN.LOG file, with specific deny permissions on the rest of the files. You can never be too safe; the real reason I deny access on the other files is that I would not want someone internal to be able to sniff the creds, FTP in, and look at my URLSCAN.INI file to see exactly what I had configured.
Good luck!
