The Internet is an amazing place where we can expand our knowledge – or just look at pictures of animals with funny captions. But have you ever wondered who owns a domain, who took the time to build that amazing website, or whether a business is legit? Or maybe you just want to learn a new nerdy skill?
A domain name can be registered by anyone, so long as it's available and not already registered to someone else, and can be bought at any time through any of the thousands of companies known as domain registrars. The job of a domain registrar is to take money and convert it into domain registrations – they are essentially the middlemen between the domain registries (the top dogs of the domain world, the owners of the bit after the dot) and ourselves.
When a domain is registered, regardless of the registrar used, contact details will always need to be provided. These details form what's known as the legal registrant, which can be either a company or an individual who will legally own the name for the length of the registration.
That’s great but what next? Well here comes the juicy bit! All that information is kept in a global database known as the WHOIS database (pronounced “who is”) which is free to browse and will give an insight into any domain registration.
The following guide will show you step by step how to query the WHOIS database for free, with no special software required. To keep things simple I will be using a website that I created which has a built-in WHOIS tool.
- First things first we need to head to the WHOIS tool – click the following link or type it into your address bar directly: http://www.nerdtools.co.uk/whois/
- Once the website loads you'll see a box asking you to enter a domain name. Enter the domain you would like to query and press Enter or the "Let's do this! >" button
- After a few seconds you'll be redirected to a new page that shows the domain details in a format similar to the one shown below:
- As you can see from the screenshot above, a lot of information is returned – so much that it doesn't all fit on screen without scrolling – but once you read through you will easily see who owns the domain, when it was registered, when it expires and other useful information
- In the example above no "Registrant's address" is returned. This is because it's a .UK domain and Nominet (the registry behind all .UK domains) allows the address to be hidden for non-trading individuals, but with domains such as .COM, .NET and .ORG the information will always be available
- Depending on the domain name things may look a little different to the one in the example
- Any changes to a domain's details can take up to 24 hours to show, so things may not always be accurate
- There are strict terms that need to be followed when it comes to using the information returned from a lookup, and these can usually be found at the bottom of the results – they're not shown in the screenshot as they were so big; to see them click here and scroll down
- Sometimes registrars offer a privacy package that will hide the registrant's contact information and replace it with the registrar's instead. If you see a domain like this that's trading as a business, stay well away as it could be up to no good!
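As an aside, you don't strictly need a website at all – most Linux and macOS machines come with (or can easily install) a command line whois client that queries the same database. For example:
whois example.com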
Whenever I deploy a new server I always ensure that any weaknesses I've learned about over my few years of server experience are fixed, leaving the new server as secure as can be and ready for use.
Below are a few tips for keeping your server as secure as can be:
- Have a secure root password – Use something random and at least 8 characters long
- Use non-default ports – Change the default port for services commonly targeted by bots or attackers such as SSH
- Check your logs – Look for authentication failures and put the related IPs in a block or reject rule using iptables (see the example below)
- Process users – Make sure processes have their own users and aren't run as root
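To make the ports and logs tips a little more concrete, here's a rough sketch – the port number and IP address below are just placeholders, so pick your own:
# in /etc/ssh/sshd_config, move SSH off the default port 22, then restart sshd
Port 2222
# block a host that keeps failing authentication in your logs
iptables -A INPUT -s 203.0.113.10 -j DROP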
More tips will be added once I remember them!
The simple one-line command below will enable the EPEL repository on CentOS 7
rpm -Uvh http://dl.fedoraproject.org/pub/epel/7/x86_64/e/epel-release-7-2.noarch.rpm
Once run you will see confirmation that it has been installed successfully – that's it!
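If you want to double-check, the following command should now list epel among the enabled repositories:
yum repolist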
- You can find out more about the EPEL repository here
- If you don’t already have a server, I’d strongly recommend starting with DigitalOcean
A recent project of mine called Coop Cam uses several live video streams served by an Icecast server at different mount points, which works great, but I found there was no real solution to simply display how many viewers were actually watching the live streams.
I put together some basic PHP that reads the Icecast XML stats file and retrieves the current overall viewers (or listeners, as they're officially known) across all available mount points.
// get the stats xml file //
$output = file_get_contents('http://admin:password@youricecastservername.com:8000/admin/stats');
// explode to make the magic happen //
$listeners = explode('<listeners>', $output);
$listeners = explode('</listeners>', $listeners[1]);
// output to the world //
echo "Currently <b>$listeners[0]</b> people are watching the live stream!";
Once you have amended the admin password, server name and port, the code above will connect to your server and read the /admin/stats XML file. From there it literally picks out the content between the <listeners></listeners> tags, which becomes $listeners[0] – simply place this wherever you want to display the number of current viewers.
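As a side note, if your host has the SimpleXML extension enabled, a slightly more robust way to grab the same value is to parse the XML properly instead of exploding strings – a rough sketch, assuming your Icecast version exposes a top-level <listeners> element in /admin/stats:
// parse the stats xml instead of exploding it //
$xml = simplexml_load_string($output);
echo "Currently <b>" . (int) $xml->listeners . "</b> people are watching the live stream!";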
- This code may or may not work depending on whether your hosting provider allows the file_get_contents function – In my case I use my own dedicated servers and it works without issue; if you have any problems I'm sure I can sort something for you!
- You can show the number of sources, file connections and so on by amending the code to reflect the correct tags – A full list of tags can be seen by visiting the youricecastservername.com:8000/admin/stats page
- You can find a live working example of this script here or actually see it in place here
- Finally, you can download the script by clicking here
For a while now I've used a cheap SunLuxy H.264 DVR as the heart of the CoopCam project. Initially I couldn't get a direct link to the camera stream, so I had to screen-capture the bog-standard web interface using VLC and break the feed down into separate streams, but recently, after a fair bit of trial and error, I discovered a much easier solution!
I had researched on and off for months, went through masses of trial and error with various software and ultimately found no solution, but after being inspired again I headed to the DVR's web interface to start from scratch. I stumbled across source code in a file called /js/view2.js that constructs an rtmp:// address to show live camera feeds through the web interface's Flash player – see the snippet of code below:
dvr_viewer.ConnectRTMP(index, "rtmp://" + location.host, "ch" + index + "_" + (dvr_type=="main"?"0":"1") + ".264");
After removing the jargon the link came out as rtmp://dvraddress:port/ch#_#.264 with the first number being the channel you want to connect to (starting at 0) and the second being the stream (substream being 1 and main being 0)
I headed to VLC player, selected Open Network Stream and entered the following:
rtmp://192.168.0.100:81/ch0_0.264
Broken down you can see my DVR is on the local network as 192.168.0.100 at port 81 and that I wanted to view channel 1's main stream. Lo and behold, after a few seconds the camera started to play!
- To convert the stream to something more useful you could use rtmpdump and ffmpeg on Linux systems – a rough sketch follows these notes, and I'll write another guide about that shortly
- If you do something wrong and overload the DVR then you’ll hear a beep as the box reboots
- If this works for you please comment your DVR make and model
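For the curious, the rtmpdump and ffmpeg idea mentioned above might look something like this – the address matches my example and the output file name is arbitrary, so treat it as a sketch rather than a finished recipe:
# pull the live RTMP feed and remux it to a file without re-encoding
rtmpdump -r "rtmp://192.168.0.100:81/ch0_0.264" --live -o - | ffmpeg -i - -c copy recording.mp4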
Virtualmin is constantly being developed and gaining ever more useful features, and for a while now it has featured two-factor authentication, which is great – but what happens if you get locked out of your system? As long as you have SSH or console access you can follow the steps below to easily get back in.
Disabling two-factor authentication for a single user
- Get root SSH or console access
- Edit the file /etc/webmin/miniserv.users, comment out the current line for the user, then create a fresh copy above it
- Remove any mention of "totp" and the long string of characters near the end and save – your file should now look something like the example after these steps
- Restart Webmin and log back in normally
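As a rough illustration of the miniserv.users edit, the result might look something like this – note the password hash and totp secret here are made-up placeholders and your line may have a different number of fields:
#root:$1$XXXXXXXX$YYYYYYYYYYYYYYYYYYYYYY:0:::totp:ABCDEFGHIJKLMNOP
root:$1$XXXXXXXX$YYYYYYYYYYYYYYYYYYYYYY:0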
Disabling two-factor authentication entirely
- Get root SSH or console access
- Edit the file /etc/webmin/miniserv.conf and find the line “twofactor_provider=totp” and replace with “twofactor_provider=” and save
- Edit the /etc/webmin/miniserv.users file as mentioned above
- Restart Webmin and log back in normally
- I’ve had success with this on Webmin 1.760 running on CentOS 7.0
Any good web host will secure the contents of website directories which don't have an index page by not allowing the files or folders to be listed; instead you'll get a 403 error page saying access is forbidden. Whilst this is good in practice, sometimes you might actually need to list the contents – and it's simple to enable on an Apache web server: add one line to your .htaccess file and you're done!
How it’s done
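Add the following line to a .htaccess file in the directory you want to be listable – this is the standard Apache directive for enabling directory indexes:
Options +Indexes
Note your host needs to allow the Options directive to be overridden (AllowOverride Options) for this to take effect.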
- If you have access to the main Apache configuration you can add the same Options directive there to make it global
I used to be a customer of popular cloud backup service Livedrive. The upload and download speeds were nothing to shout about and one annoyance was having to pay extra to add a NAS drive to your account, but there is a workaround!
All you need to do is add a symbolic link to your NAS drive from your computer. Think of a symbolic link as a fancy shortcut, the only difference being it masks the destination instead of taking you straight there – you’ll see what I mean when you read on.
Imagine you have a Windows computer with the root of your NAS drive already mapped to Z:. You have a folder on your NAS called MyFiles and can browse to Z:\MyFiles to see whatever is stored there. Next imagine you have a folder called C:\Backup which is already uploading to your Livedrive account. Using the following command we will make C:\Backup\MyFiles lead to your NAS and in turn be included with your Livedrive backup.
mklink /d "C:\Backup\MyFiles" "Z:\MyFiles"
For me this worked absolutely fine, and I had a couple of TB uploaded without ever being caught out. I've since jumped ship to Amazon Drive; whilst it is more expensive per year, I've got it running from multiple computers and the upload and download speed always tops out my connection, so I can't complain!
- Use the above guide at your own risk – I won’t be held liable if anything happens to your Livedrive account, files or anything else because of this!
- This doesn’t work with Dropbox or Google Drive – sorry
- You only need to run the command once, after that the link will be remembered
- To remove the link just delete it as you would any other file or folder
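- For the command line minded, rmdir will remove a directory symbolic link without touching the files it points to, for example:
rmdir "C:\Backup\MyFiles"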
Recently I created a script that uses the Bad Bots database: a list of known bad bots is retrieved and then parsed into iptables firewall rules to prevent further access from known bad hosts.
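In essence the script does something along these lines – a simplified sketch, where badbots.txt stands in for the downloaded list with one IP per line:
# read the list and add a drop rule for each bad host
while read -r ip; do
  iptables -A INPUT -s "$ip" -j DROP
done < badbots.txt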
More information can be found at GitHub here.
Imagine this… you have two decent network attached storage boxes which regularly back up one to the other using a built-in Disk Backup tool – brilliant huh, sounds almost like a nerdy dream! Now imagine part way through a backup you get a power cut, or you trip over the power cable ripping the plug out of the wall… not to worry, things will pick up where they left off… unless those decent boxes are Buffalo LinkStations!
I first discovered this flaw a few weeks back when one of my nightly backups seemed to be taking longer than usual. I gave the box about a day or so to try and fix itself, but it still kept saying in the admin interface that the disk backup was in progress, and I was unable to cancel or remove the backup, so it was pretty much stuck as you can see below:
I headed to the official Buffalo support website which seemed to have a fix for this common problem – See for yourself below:
Okay, so you have to restore the box to factory defaults… no thanks! I can only assume that because the HS-DHGL is one of their older discontinued products they just can't be bothered to release a firmware update as it's not worth their time or effort, but there is another option: use SSH to edit a file which will force the backup to complete.
The following guide will assume you have already enabled SSH and are logged in ready to go – if you haven't yet enabled SSH, see this post here.
- First of all we need to locate the backup configuration file – the file name depends on the job number specified in the admin interface (in my case it was number 1) – and open it in the vi text editor
- With the configuration file open, press i (insert) on your keyboard to allow editing and change the line status=running to status=done
- Hit the Escape key and then type :wq to save your changes and quit
- Head back to the admin interface to the Disk Backup section and you’ll now see the backup showing as complete as seen below:
- That’s it – The backup is unstuck, and we haven’t had to restore anything to factory defaults!
- This has been tried and tested on the following models/firmware: HS-DHGL/v2.1
- Finally, if you encounter any problems or can confirm this works for other models, please let me know – I'd be grateful