Not every forum administrator wants their forum to be public. Some want to have a members-only forum. In fact, it’s not unusual to want the forum to be completely hidden from, or wholly inaccessible to, the public. Some forum administrators realize it’s important not only to keep humans out, but search engines as well.
The good news is that phpBB can keep your forum private, although there are some steps you might want to take outside of phpBB. The bad news is that the procedures for doing so are pretty obscure. Let’s look at some common ways of limiting access.
Keeping everyone out using your web server’s security system
Pros: about as secure as you can get
Cons: shared passwords are often used, ugly interface, and it works separately from the forum
The most effective way to keep everyone out but specified users is to use a security mechanism that is built into your web server. The technique originated with the Apache web server, which remains one of the most widely used. If you are on Windows hosting, you are probably running IIS, Microsoft’s web server. nginx (pronounced “Engine X”) is another popular web server that has been steadily gaining ground on Apache.
With this approach, the first step is to determine what web server software you are running. This site makes it easy.
The idea is to have the web server challenge anyone trying to reach the forum’s folder by requiring them to provide credentials, usually a username and password. Typically the browser presents a plain, unstyled dialog with these fields and a submit button. So this approach is not pretty, but it is highly secure.
If you want to go with this approach, first look at your web host control panel. Control panels like cPanel often have a feature that lets you password protect folders, in this case your phpBB root folder. Here are cPanel’s instructions. Failing that you can do this yourself.
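If you do set this up yourself on Apache, the pieces are a .htaccess file in the folder you want to protect and a password file created with Apache’s htpasswd utility. A minimal sketch follows; the paths and realm name are examples you will need to adjust for your hosting:

```apacheconf
# .htaccess in your phpBB root folder (Apache; paths below are examples)
AuthType Basic
AuthName "Members only"
# Keep the password file outside the web root if your host allows it
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The password file itself is created on the server with a command like `htpasswd -c /home/example/.htpasswd alice` (omit `-c` after the first user so you don’t overwrite the file).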
You can make it easy and use a shared username and/or password, or create one for each member of the forum. Note that this happens outside of the forum, so the usernames and passwords used with this approach will probably not be the ones used to log in to the forum. You will have to communicate the credentials to each user, perhaps by email. This approach only grants access to the forum’s folder, so a second step is needed: the user must still log in to the forum itself.
This approach keeps out not only humans but also search engines.
Although not covered here, there are even more secure ways to limit access, such as restricting access to specific IP addresses. A search engine query will turn up instructions if this approach interests you. Since most IP addresses are assigned dynamically, this approach usually requires allowing a range of IP addresses and is somewhat fragile.
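For the curious, on Apache 2.4 an IP restriction looks roughly like the fragment below. The addresses are placeholders drawn from documentation ranges; substitute the range your ISP actually assigns you:

```apacheconf
# .htaccess: allow only these addresses (Apache 2.4+ syntax; example IPs)
Require ip 203.0.113.0/24
Require ip 198.51.100.7
```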
Stopping search engines from indexing your site with a robots.txt file
Pros: Simple and probably 99% effective
Cons: Malicious search engines can choose to ignore your policy
You can instruct search engines not to index your site. This is only a request, however, so it won’t stop malicious crawlers that choose to ignore it. Essentially you create a robots.txt file in a plain text editor like Notepad and upload it to your forum’s root folder. Its contents should look like this:
User-agent: *
Disallow: /
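As a sanity check, you can confirm how a well-behaved crawler interprets those two lines using Python’s standard urllib.robotparser module. The user agent name and URL below are just examples:

```python
from urllib.robotparser import RobotFileParser

# Parse the robots.txt rules directly (no network access needed)
parser = RobotFileParser()
parser.parse(["User-agent: *", "Disallow: /"])

# A compliant crawler may not fetch anything under the site root
print(parser.can_fetch("Googlebot", "https://forum.example.com/viewtopic.php"))  # False
```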
Disallowing search engines using phpBB
Pros: Effectively stops search engines that phpBB knows about, which are most of them. With the permissions properly set, these search engines cannot index your content.
Cons: Limited to the 46 search engines that phpBB handles by default
To set this up:
- ACP > Permissions > Permission roles > Forum roles
- Click on the green wheel on the Bot Access row
- Go to the bottom of the page and select the Actions tab
- Click the No column header link, which sets all of these permissions to No at once. Then press Submit.
Changing the properties of the Bots role affects all existing bots plus any additional bots you create manually later on.
If you want to add bots manually, you can do it this way: ACP > System > General tasks > Spiders/Robots. Where would you discover new robots that might be hitting your site? You would need to periodically review your web server access log.
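One quick way to review your access log for crawler user agents is a short script like this. The log lines below are made-up samples in Apache’s “combined” log format, and the keyword list is only a starting point you would extend as you find new robots:

```python
import re
from collections import Counter

# Sample lines in Apache "combined" log format (made-up data)
log_lines = [
    '66.249.66.1 - - [01/Jan/2024:10:00:00 +0000] "GET /viewforum.php HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [01/Jan/2024:10:00:05 +0000] "GET /index.php HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (Windows NT 10.0) Firefox/121.0"',
    '40.77.167.2 - - [01/Jan/2024:10:00:09 +0000] "GET /viewtopic.php HTTP/1.1" 200 4096 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
]

# The user agent is the last quoted field in the combined format
agent_pattern = re.compile(r'"([^"]*)"$')
# Common substrings that suggest a crawler; extend as you discover new ones
bot_keywords = ("bot", "crawler", "spider", "slurp")

hits = Counter()
for line in log_lines:
    match = agent_pattern.search(line)
    if match:
        agent = match.group(1)
        if any(word in agent.lower() for word in bot_keywords):
            hits[agent] += 1

# List the crawler user agents seen, most frequent first
for agent, count in hits.most_common():
    print(count, agent)
```

Running this against your real log (reading the file line by line instead of the sample list) gives you a ranked list of crawler user agents to compare against the bots phpBB already knows about.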
The phpBB group periodically adds new robots, so when you update or upgrade, these new robots will appear and inherit the privileges of the Bots role.
You can certainly add a robots.txt file disallowing access to your forum root folder and use these procedures too.
Disallowing guest access to forums
Pros: Removes guest read privileges
Cons: A little complex to set up and message to guests is misleading
To set this up:
- ACP > Forums > Forum based permissions > Group forum permissions
- Select the Guests usergroup and press Submit
- Select the forums that you don’t want guests to read or access. For all, check All Forums. Then press Submit.
- If you want guests to neither read the forum nor see its name, for each forum change Read Only Access to No Access, then press the Apply All Permission button at the bottom of the page. Note: if all forums were changed, then at this point guests accessing the index will see a “No forums” message. This is misleading because the forums are still there; you just have to be registered, logged in, and have appropriate permissions to see them.
- If you want guests to see the forum name but not be able to see or read any topics, first complete the previous step. Then for each forum click on Advanced permissions, select the Actions tab, and set Can see forum to Yes. When this has been applied to all applicable forums, press the Apply All Permission button at the bottom of the page.
If security is a concern, consider also using HTTPS to encrypt all traffic to and from your forum. More on this is covered in this post.