Hello all,

We know how important it is as a team to keep improving our moderation, and since late August, the development team has been building the tools the moderation team needs to improve its effectiveness. Moderation hasn't been a topic we often spoke about publicly in the past, but we think this is a good time to improve our transparency and share some extra insight into our moderation and live operations.

One of the key elements of any competitive multiplayer game is moderation, but this is not an easy challenge for any company, including Hypixel. Being part of the Minecraft platform presents an extra, unique challenge: it prevents us from using some of the solutions that owning the accounts and the client would offer. This doesn't stop us from striving to improve the community and our moderation efforts, but it does mean we often have to spend a lot of extra resources to do so.

As a team, we have always agreed to strive for a balance between strictness and forgiveness. We have a lot of young players in the Hypixel community who are often influenced by others or don't understand that what they are doing is wrong. We strongly believe we should educate everyone by providing a fair warning or a second chance, though we recognize this has not been a popular decision in the eyes of some of our community.

So what have we been doing to improve the fairness of competition for players? A lot of our focus has been on prevention and detection systems such as Watchdog. From Watchdog's release in January 2016 to the time of writing this post, it has issued over 2,629,859 actions against players. I honestly wouldn't like to imagine what things would be like without it. However, we understand that this is not an all-in-one solution, as cheat makers will always look for new methods to exploit in the Minecraft client, which means we are in a continuous tug of war with them.
To help us with this issue, we have been working on broader cheater checks that look out for new exploit methods. These checks alert our moderation and development teams to a new method and give us more time to work on prevention and issue punishments. This has already produced great results, with over 12,000 detections in the past week.

We have also been spending design and development time on improving our manual detection and punishment system, which we have nicknamed "The Forgiveness System," and it has been fully rolled out as of this week. It improves the following:

- Automatically evaluates a player's recent behavior to decide whether the system should offer forgiveness or strictness when punishing a rule break. Developer Note: [This means players who repeatedly break the rules will now receive stricter punishments than before, as these players' actions should not be allowed to affect other players. Cheating has very much been the focus of this improvement, so the punishments for it have been made very strict.]
- Punishments remove access to chat or to the server based on what type of rule you're breaking. Chat-related rule breaking removes your ability to talk to other players, while gameplay-related rule breaking removes your ability to play on the network. Developer Note: [No rule breaking is acceptable. A player found being rude or abusive to other players via chat may also face a server ban if the behavior repeats.]
- Improves the effectiveness and fairness of the moderator team by allowing them to focus on decision making rather than on the punishment action.
- Punishment messages are now more detailed and easier to understand for non-English players.
- Better monitoring and communication between the moderation systems and the administration team, allowing us to monitor and improve all of our moderation actions.
- Allows our development team to integrate new tools more easily and quickly.

Appeals will operate as normal.
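For the curious, the forgiveness-vs-strictness decision described above can be pictured roughly like the sketch below. This is purely illustrative: the record fields, thresholds, window length, and punishment names are our invented placeholders, not the actual internals of the system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical infraction record; the fields are illustrative only.
@dataclass
class Infraction:
    category: str       # "chat" or "gameplay"
    timestamp: datetime

def decide_punishment(history, new_category, now, window_days=90):
    """Illustrative forgiveness-vs-strictness decision:
    a first offence in the recent window gets a lenient action,
    repeat offences escalate. Chat offences restrict chat;
    gameplay offences (e.g. cheating) restrict network access
    and are treated strictly from the start."""
    recent = [i for i in history
              if now - i.timestamp <= timedelta(days=window_days)]
    repeat_count = sum(1 for i in recent if i.category == new_category)

    if new_category == "chat":
        # Persistent chat abuse can escalate all the way to a server ban.
        if repeat_count >= 3:
            return "server_ban"
        return "mute" if repeat_count >= 1 else "warning"
    # Gameplay rule breaking starts strict and escalates on repeat.
    return "permanent_ban" if repeat_count >= 1 else "temporary_ban"
```

So a first-time chat offender would only be warned, while a repeat cheater would be banned outright; the real system's exact rules and thresholds are, of course, its own.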
In the odd case that you think your ban or mute was false, another moderator will verify that the evidence provided is valid for the ban or mute that was issued.

What does this mean for me as a player?

Well, if you follow our network rules, which you can READ HERE, you may not notice any change, but you should be aware that we are being stricter on rule breakers who affect your gameplay. For those who choose to actively break the rules, this is a fair warning that it is not acceptable, and you may find yourself unable to take part in our community if you choose to do so.

Have you considered doing this?

We have seen a lot of suggestions and feedback from the community, such as the idea of IP banning, and we want to let you know that yes, we do consider all options for our moderation systems. In fact, we have had a system similar to IP banning live for the last two years and have been continuously improving it since then. We understand there are lots of ideas, but there is also a lot to consider before implementing a new system. It is very important to us that legitimate players are not negatively affected by any new moderation system.

What's next?

We feel comfortable that our moderation tools are heading in the right direction. We want to focus on making it easier for players to report rule breaking in-game and to offer players some self-moderation tools. So over the next month, we are working on a new reporting feature that will better integrate chat reports and Watchdog reports into one system, giving our moderation team an easier way to action player reports. More information about this will come closer to release.

We understand that everyone makes mistakes, which is why we want to keep a balance between forgiveness and strictness. We hope that this post and our new moderation systems show players that we are taking repeat rule breaking seriously. I encourage you, before doing something, to think about the effect it could have on other players.
If you think your action could negatively affect someone else, do not do it.