Introduction

MongoDB installs with a base configuration that is not optimal for many organizations. In this quick look at MongoDB, we are going to cover the steps necessary to configure it to use a different location for storing its document tree.

Overview

The popular document database MongoDB is easily installed on a CentOS 7 server. Changing the configuration of the server, however, can lead to conflicts with SELinux that keep the MongoDB service from starting. In this post, we will learn how to change the MongoDB configuration and get it up and running with a dedicated document tree directory.

Getting the Latest and Greatest MongoDB

The first step in installing MongoDB in a proper fashion is to create a Yum repository definition file. Yum repository definition files are located at /etc/yum.repos.d in a CentOS 7 installation.  In this directory create the file MongoDB-org.repo.


cd /etc/yum.repos.d
vim MongoDB-org.repo

Enter the following into the file and save it.
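The exact contents depend on the MongoDB release series you want to track; a definition along these lines is typical (shown here for the 3.6 series, current when this article was written, so adjust the version in the name, baseurl and gpgkey as needed):

[mongodb-org-3.6]
name=MongoDB Repository
baseurl=https://repo.mongodb.org/yum/redhat/$releasever/mongodb-org/3.6/x86_64/
gpgcheck=1
enabled=1
gpgkey=https://www.mongodb.org/static/pgp/server-3.6.asc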

We can now pull the latest stable version of MongoDB from the repository with the following command:

yum -y install mongodb-org mongodb-org-server

Once the installation is complete, we can start the MongoDB daemon by typing:

systemctl start mongod

By typing ps -ef | grep mongo we can verify the daemon is indeed running.

Note the location of the configuration file is /etc/mongod.conf. We will be editing this file.

Editing the MongoDB Configuration File

The standard location for the MongoDB directory tree is /var/lib/mongo. This can cause a major headache when the document tree grows too large for the file system. By creating a new document tree location on another drive, a mounted partition or simply a dedicated directory in another location, we can make managing MongoDB easier in the future.

The MongoDB installation also has two other settings that most users will want to change. In this example, we will use a dedicated directory on a new drive mounted at /data.

So first we make our edits to /etc/mongod.conf.

vim /etc/mongod.conf

The edits we want to make occur at the top of the file. If we wish, we can change the location of the log file, the database document tree directory, the server port number or the IP address the server binds to. We will leave the log file location at the default setting, but move our document tree to the /data/mongodb directory.

We uncomment the line assigning a port number to ensure the server will always use the default setting. By commenting out the bind IP setting, the server will respond to requests from any host and not just localhost.
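For reference, here is a sketch of how the relevant portion of /etc/mongod.conf might look after these edits. The YAML keys follow the format shipped with recent MongoDB packages; only the dbPath value and the commented-out bindIp line differ from the defaults:

systemLog:
  destination: file
  logAppend: true
  path: /var/log/mongodb/mongod.log

storage:
  dbPath: /data/mongodb
  journal:
    enabled: true

net:
  port: 27017
  # bindIp: 127.0.0.1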

Save the file and restart the MongoDB daemon.
systemctl restart mongod

Unfortunately, we now receive the following error:

[root@localhost]# systemctl restart mongod
Job for mongod.service failed because the control process exited with error code. See "systemctl status mongod.service" and "journalctl -xe" for details.

This is because we need to change the ownership of the new directory and place /data/mongodb into the SELinux context that MongoDB runs under.

Adding the New File Tree for MongoDB to the SELinux Context

So first we change to the directory above our newly created mongodb directory and then execute ls -al.

The mongodb directory is owned by root, but it needs to be owned by the MongoDB daemon user, mongod.

chown mongod:mongod mongodb

Next we check the SELinux file attributes by listing them for the default MongoDB file tree directory, /var/lib/mongo. We can do this by using the -Z option of the ls command.
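For example, adding the -d flag shows the context of the directory itself rather than its contents:

ls -dZ /var/lib/mongo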

To set the file attributes for the mongodb directory to match the default directory attributes, we use the chcon (change context) command. The file attributes are as follows:

  • User: system_u
  • Role: object_r
  • Type: mongod_var_lib_t
  • Level: s0

Each of these parameters can be set individually, or all at once using the chcon command. We can also use another file as a reference, so we will use the previously defined /var/lib/mongo directory:
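A single command of this form copies the context from the reference directory; the -R flag applies it recursively in case files already exist under /data/mongodb:

chcon -R --reference=/var/lib/mongo /data/mongodb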

Then we view the results with ls -Z.

We can now restart MongoDB.

The restart will take a bit of time as the MongoDB daemon writes new journal, lock and temp directories and files into the new data directory location.

Conclusion

In this article we have installed the latest version of the MongoDB document database on our CentOS 7 server, created a dedicated location for the document file tree and configured the daemon and SELinux to work together. In future articles, we expect to take a look at creating document tree structures and various indexing techniques.

Introduction

One of the most common vulnerabilities in website security is inadequate protection against enumeration attacks.  In this article we’re going to look at several easily implemented strategies to protect your website or application from bad actors.

Overview

Enumeration attacks take many forms, but have one goal: finding a user or list of user names that have accounts on the web site.  In many cases, what seems to be good security is painfully easy for an experienced hacker to exploit.  In this article, we’ll learn how to implement some methods to protect websites from some of the most used attacks.  These methods are intended to be part of a larger comprehensive website security strategy.

Spring Security’s WebSecurityConfigurerAdapter

By implementing Spring Security's WebSecurityConfigurerAdapter and properly setting its configuration, we can protect the entire site. All authentication requests go directly to Spring Security, and the default configuration is fairly robust and will get us started. The WebSecurityConfigurerAdapter allows us to protect website directories and pages, define responses to failed login attempts, restrict movement within the website and manage logouts, sessions and cookies.

While Spring Security provides a great deal of protection through its default implementation, it can be improved upon. To ramp up our protection against enumeration attacks, we will modify the SpringSecurityFilterChain, implement alternate authentication methods and customize login responses with the AuthenticationSuccessHandler and AuthenticationFailureHandler interfaces.
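As a starting point, a configuration along these lines registers custom handlers with the form login; the class name, URLs and handler beans here are placeholders rather than code from this article:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;
import org.springframework.security.config.annotation.web.configuration.WebSecurityConfigurerAdapter;
import org.springframework.security.web.authentication.AuthenticationFailureHandler;
import org.springframework.security.web.authentication.AuthenticationSuccessHandler;

@Configuration
@EnableWebSecurity
public class SecurityConfig extends WebSecurityConfigurerAdapter {

    @Autowired
    private AuthenticationSuccessHandler successHandler;

    @Autowired
    private AuthenticationFailureHandler failureHandler;

    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http.authorizeRequests()
                .antMatchers("/signup", "/static/**").permitAll()
                .anyRequest().authenticated()
            .and()
            .formLogin()
                .loginPage("/login").permitAll()
                .successHandler(successHandler)   // control what a successful login reveals
                .failureHandler(failureHandler);  // keep failure responses generic
    }
}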

Simple Rate Limiting

A first step in stopping enumeration attacks using Spring Security is to limit the number of failed login attempts from an individual IP address or user in a short time period. The Spring Security framework provides an AuthenticationFailureBadCredentialsEvent that we can use to track this type of activity.

By implementing an AuthenticationFailureEventListener, we can catch failed login attempts, track them and apply a lock-out strategy once a threshold is reached. We then modify our AuthenticationFailureHandler to display the Account Login Blocked error page. The block can last for any period we wish, such as 24 hours.
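A minimal sketch of such a listener, assuming a hypothetical LoginAttemptService that does the counting and lock-out bookkeeping:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationListener;
import org.springframework.security.authentication.event.AuthenticationFailureBadCredentialsEvent;
import org.springframework.security.web.authentication.WebAuthenticationDetails;
import org.springframework.stereotype.Component;

@Component
public class AuthenticationFailureEventListener
        implements ApplicationListener<AuthenticationFailureBadCredentialsEvent> {

    @Autowired
    private LoginAttemptService loginAttemptService; // hypothetical counter/lock-out service

    @Override
    public void onApplicationEvent(AuthenticationFailureBadCredentialsEvent event) {
        Object details = event.getAuthentication().getDetails();
        if (details instanceof WebAuthenticationDetails) {
            // Record the failure against the caller's IP; the service applies
            // the lock-out once the configured threshold is reached.
            String ip = ((WebAuthenticationDetails) details).getRemoteAddress();
            loginAttemptService.loginFailed(ip);
        }
    }
}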

Permanent Blocking of Repeat Offenders

A comprehensive protection strategy could include collecting and analyzing failed login attempts and building a blacklist of IP addresses that are permanently blocked from accessing the site. By inserting a custom filter into the SpringSecurityFilterChain, it becomes part of the authentication process and we can easily block all login requests from blacklisted addresses, as sketched below. The filter might consult a local resource, an in-memory database or another low-latency data store to obtain the blacklist.
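One way to sketch this is a OncePerRequestFilter; the BlacklistService used here is a placeholder for whatever low-latency store holds the blocked addresses:

import java.io.IOException;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.springframework.web.filter.OncePerRequestFilter;

public class BlacklistFilter extends OncePerRequestFilter {

    private final BlacklistService blacklistService; // hypothetical lookup into the blocked-IP store

    public BlacklistFilter(BlacklistService blacklistService) {
        this.blacklistService = blacklistService;
    }

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response,
                                    FilterChain chain) throws ServletException, IOException {
        if (blacklistService.isBlocked(request.getRemoteAddr())) {
            // Reject blacklisted callers before authentication is even attempted.
            response.sendError(HttpServletResponse.SC_FORBIDDEN);
            return;
        }
        chain.doFilter(request, response);
    }
}

The filter can then be wired into the chain in the security configuration with http.addFilterBefore(new BlacklistFilter(blacklistService), UsernamePasswordAuthenticationFilter.class).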

Implement Two-Factor Authentication

A favorite of financial institutions is a two-, three- or four-factor authentication process. Along with the usual login page, the user has to answer security questions, provide additional information such as a PIN or other personal details, or identify an image. The odds against successfully breaching security increase with each factor applied. The basic idea is a verification process that follows the well known principle of "something the user knows and something the user has".

–Assign the PRE_AUTH_USER role–

When a user successfully provides valid credentials such as User ID and Password, they are assigned the PRE_AUTH_USER role by Spring Security in our custom UserDetailsService.

–Create a second level login form–

This page will have a field for entering an additional parameter such as a PIN or numeric code. A popular solution is a soft token or a time-based one-time password (TOTP) verification algorithm such as the one used by Google Authenticator.

–Configure Spring Security–

Configure Spring Security to provide a URL and a security restriction for the second-level form. After the initial login, the user is directed to the "/code" path, where they are shown the code entry screen. After the code is validated, they are assigned the USER role and then redirected to the home page.

The SpringSecurityConfig.java appears as follows:

@Override
protected void configure(HttpSecurity http) throws Exception {
    http.authorizeRequests()
        .antMatchers("/signup", "/static/**").permitAll()
        .antMatchers("/code").hasRole("PRE_AUTH_USER")
        .antMatchers("/home").hasRole("USER")
        .anyRequest().authenticated();

    http.formLogin()
        .loginPage("/login")
        .permitAll()
        // always use the default success url, even if a protected page was requested before login
        .defaultSuccessUrl("/code", true)
        .and()
        .logout()
        .permitAll();
}

Other forms of two-factor authentication can also be used, such as QR codes, security questions and pass phrases.

Allow Login from Accepted Locations

This is the reverse of the IP blocking strategy above. In this case we perform a hidden form of two-factor authentication to verify the login credentials. When a user registers with the website, we capture the IP address used and determine which country it is located in. On a subsequent visit, after the user authenticates, the location of the IP address in the request is checked against the stored country. Access can be denied or further login credentials can be requested.
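In outline, and leaning on a hypothetical GeoLocationService for the IP-to-country lookup (the UserAccount type and field names are also assumptions), the post-authentication check could look something like this:

public boolean loginLocationMatches(HttpServletRequest request, UserAccount account,
                                    GeoLocationService geoService) {
    // Resolve the country of the caller's current IP address.
    String currentCountry = geoService.countryOf(request.getRemoteAddr());

    // Deny access, or escalate to extra credentials, when the country differs
    // from the one captured at registration time.
    return currentCountry.equalsIgnoreCase(account.getRegistrationCountry());
}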

Implement an OpenID solution

OpenID allows users to be authenticated by a third-party service, eliminating the need for webmasters to provide their own login systems. Using an OpenID provider lets us select different approaches ranging from the common (such as passwords) to the novel (such as smart cards or biometrics). The identity provider will also be able to give you data on the login activity on your site, enabling you to do any necessary analysis. All of the big players are OpenID providers (Google, Microsoft, Facebook, Yahoo), but there are also other providers such as Keycloak, MyOpenID and VeriSign.

Conclusion

By using one or more of the above solutions, we can increase our site’s security, block enumeration attacks from hackers and still keep a reasonable amount of usability for our users. However, no one security solution is entirely effective in protecting a website or application and adopters of solutions should research the pros and cons of each technique and address any shortcomings.

Introduction

Enforcing strong passwords on a website is a common and necessary requirement. By using freely available open source solutions, we can implement a lightweight, effective and reusable solution to this problem.

Overview

In this article, we will be implementing a strong and effective password requirement policy on a website. The same solution can be used for registration and password reset functions. We will be using JavaScript and jQuery to build a responsive and attractive UI that provides visual feedback to the user while simultaneously verifying that the password entered meets our strength requirements.

Creating a Registration Page

Our registration page is a regular JSP page, and we will be using <div> tags and special <form> labels to implement our password checking functionality and feedback. Our jQuery code will manipulate the values of these tags to make all the magic happen.

Importing the jQuery library into a webpage is simple and requires only the <script> HTML tag. The following import tags are placed within the <head> tag on our JSP page. The first tag references an online version of jQuery. We can also download the library and put it into our resources directory so that we are not dependent on an external source. The other scripts provide the custom methods that implement our password strength checker UI. The jquery_validate.js script lets us create a validator class that applies rules to our form field entries, giving us error messages in red under any field with an error.

Our JSP page is coded as follows:

The pswd_info.js Script

Our method makes use of jquery-3.2.1.js and its jquery.validator form logic. The pswd_info.js script handles all the UI updates and determines whether the password data meets our criteria.

When the user starts typing in the password box, the interface is updated by the JavaScript and provides an easy to follow guide along with color-coded visual cues.
[Screenshot: the strong password user interface guide]

[Screenshot: successful password entry]

[Screenshot: matching password and field validation errors]

The pswd_info.js script is as follows:

Supporting CSS

The last bit of code needed to make this work is a few style sheet entries. These entries are class selectors that provide the colors and the X and checkmark images used during the validation process.

Conclusion

In this article we have used the jQuery library, CSS and the JavaScript capabilities of the browser to create a lightweight, client-side interface that enforces strong password entry. Of course, the entered information should also be verified on the server side once the form is submitted. Our solution eliminates multiple server requests during the process, resulting in a smoother user experience. The code for this article can be found in the KodeKutter GitHub Project.

Introduction

Your latest project has a requirement for a JSON config file, or needs to use JSON as a data input. So you type it all out or use a JSON generator to get a JSON file structure. Then you write the code to import it and turn to building the object, only to find you left out a quote somewhere, or that the data structure you envisioned is not the best fit once you start writing the code. Well, sometimes it's just easier to do things backwards.

Overview

Retrofit an existing app, or speed the development of a small JSON component, by coding the imported data object first and then generating the needed JSON from it. By using the Jackson JSON library, you ensure that the generated JSON conforms to proper syntax, and you get to see the JSON representation of various Java objects, including HashMaps and lists of objects.

Create a Starter Project

We will be using the Eclipse-based Spring Tool Suite (STS) and its New Project Wizard to speed this project along. The New Spring Starter Project wizard in the STS makes a call to http://start.spring.io, which returns the configuration to create a Maven-based Spring Boot project in STS. All that is required is to fill out the first dialog of the wizard, click Next and then Finish. We will not be adding any other dependencies from the Spring Boot configurator.

[Screenshot: the Spring Boot starter project wizard]

Modify the pom.xml

In order to export our object as a JSON formatted file, we need to add two dependencies to the pom.xml file: FasterXML's Jackson databind library (which pulls in Jackson core) and Commons IO. The dependencies section of our pom.xml will look like this:
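Something like the following, added on top of the starter entries the wizard already generated (the version numbers here are only examples; check Maven Central for current releases):

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.9.4</version>
</dependency>
<dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>2.6</version>
</dependency>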

Creating the Application

At this point, we can construct the project to work in two different ways. We could make a conventional project that is compiled and then executed to create the JSON file, or we could write the JSON creation code as a test case. When the project is built, the Maven test phase automatically executes the test case and the JSON file is created, which eliminates a step in the process. We will use the second option and generate the JSON code using a test case.

The JSONGeneratorApplication class

Nothing special here, just be sure to use the @SpringBootApplication annotation.
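A minimal version looks like this (the package name is a placeholder):

package com.kodekutter.jsongenerator; // placeholder package name

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class JSONGeneratorApplication {

    public static void main(String[] args) {
        SpringApplication.run(JSONGeneratorApplication.class, args);
    }
}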

The POJOTestDto class

This is the object that we will use in this example.  In future use, we replace this file in the project with the class or classes we wish to generate JSON for.  If there is more than one class in this package location, the generated JSON file will contain representations of all the classes.  There will NOT be one JSON file for each class.
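The exact fields are not important; a stand-in like the following (field names and types chosen only for illustration) shows the kinds of members Jackson will render, including a list and a map:

import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class POJOTestDto {

    private Long id;
    private String name;
    private List<String> tags;
    private Map<String, String> attributes = new HashMap<>();

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public List<String> getTags() { return tags; }
    public void setTags(List<String> tags) { this.tags = tags; }

    public Map<String, String> getAttributes() { return attributes; }
    public void setAttributes(Map<String, String> attributes) { this.attributes = attributes; }
}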

The JSonGeneratorApplicationTest class

In this class we instantiate the object we want to model in JSON, fill it with data and then generate the JSON file using the Jackson library. We make sure to add the @RunWith(SpringRunner.class), @SpringBootTest and @Test annotations, which cause the JSON creation to run during the Maven test phase of the build.
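A sketch of such a test, using the stand-in DTO above; the output path and sample values are placeholders, and writerWithDefaultPrettyPrinter() simply makes the generated file readable:

import java.io.File;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

import org.apache.commons.io.FileUtils;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

import com.fasterxml.jackson.databind.ObjectMapper;

@RunWith(SpringRunner.class)
@SpringBootTest
public class JSonGeneratorApplicationTest {

    @Test
    public void generateJsonFile() throws Exception {
        // Build the object we want a JSON representation of.
        POJOTestDto dto = new POJOTestDto();
        dto.setId(1L);
        dto.setName("example");
        dto.setTags(Arrays.asList("alpha", "beta"));
        dto.getAttributes().put("color", "blue");

        // Let Jackson render it and write the result with Commons IO.
        String json = new ObjectMapper()
                .writerWithDefaultPrettyPrinter()
                .writeValueAsString(dto);
        FileUtils.writeStringToFile(new File("target/POJOTestDto.json"),
                json, StandardCharsets.UTF_8);
    }
}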

The Generated JSON file

Next we use the pom.xml to build the project, which generates the JSON file. We can either use the submenu in the STS (click on pom.xml, then right click -> Run As -> Maven Clean followed by Run As -> Maven Test) or type mvn test in the project directory containing the pom.xml file. The maven-surefire-plugin is needed in the pom.xml file if the test is run on the command line.

Conclusion

In this example we have seen how we can make life easier for ourselves by rolling our own JSON generator.  This project can also be used as a debugging tool to check and see if you have properly formed JSON for your complicated objects.  We could also cut and paste part of the code directly into another project to generate JSON files.  The code for this tutorial can be found in the KodeKutter GitHub Project.

Introduction

One of the most common vulnerabilities in web site security is inadequate protection against enumeration attacks.  In this article we're going to look at this vulnerability, how it happens and what we can do to protect ourselves and our users against these attacks.

Overview

Enumeration attacks take many forms, but have one goal: finding a user or list of user names that have accounts on the web site.  In many cases, the responses programmed to various login scenarios give away as much information as they protect.  In this article, we’ll learn how to implement some methods to protect websites from some of the most used attacks.  These methods are intended to be part of a larger comprehensive website security strategy.

Fixing the Single Factor Login Form

By simply entering various user names or email addresses into a website login page, an attacker can find out if a user with that name exists on the website. In some cases, the response actually says so!

Error message 1:  The user name you entered does not exist.
Error message 2: The password you entered does not match that user name.

Knowing that the user name exists on the system, the attacker can now proceed to a brute-force or personal-information password attack.  The first way to improve the security of a login form is to provide only a generic message for login failures, for example "Sorry, the username or password entered is incorrect."  Now the attacker does not know whether the user name exists on the system.

For stopping brute force password attacks with Spring Security, see our  article at Prevent Brute Force Authentication Attempts with Spring Security.

Fixing the Registration and Password Reset Forms

Responses on these two pages can give away information just like the login form.

Confirmation response: A confirmation email has been sent to user@your-account.com.
Error message 1:  Sorry, that email already exists, please select another email address.
Error message 2: Original password incorrect.

Don't fall into the trap of thinking that echoing the user's email address on the screen is a good way to assure the user their request was handled.  A better way to provide this support is with Spring Security and a two-step authentication process (see below).

These are not the only ways enumeration attacks can leak information about customer accounts to bad actors.  Any page on your website that is open to the internet is a potential target and it only takes one careless mistake on a shopping cart or information query page to expose your site to risks.

Response Time Indication

Check the timing of login responses too.  For instance, a slow password-hashing algorithm can reveal that an account exists: the several hundred milliseconds spent hashing the password are only incurred when the username is found in the system.  So while your response message may not reveal any information to the attacker, a repeatable slow response time for a single account ID may say otherwise.
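A common mitigation is to spend roughly the same time on unknown usernames as on known ones, for example by hashing a dummy password when the lookup fails. A sketch using Spring Security's PasswordEncoder follows; the repository, the stored DUMMY_HASH constant and the method shape are assumptions, not code from any particular project:

public boolean authenticate(String username, String rawPassword,
                            UserRepository userRepository, PasswordEncoder encoder) {
    return userRepository.findByUsername(username)
            .map(user -> encoder.matches(rawPassword, user.getPasswordHash()))
            // No such user: burn the same hashing cost anyway so the
            // response time does not reveal whether the account exists.
            .orElseGet(() -> {
                encoder.matches(rawPassword, DUMMY_HASH);
                return false;
            });
}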

Browser Cookie Errors

Some sites use a two-level login approach for additional security.  The username is required to start the login sequence, and after the username is entered, the user is then prompted for security questions.  These security questions are designed to display regardless of whether the username entered is in the database, attempting to prevent user enumeration.  It's transparent to the user, but a cookie value is set that gives the attacker a way to determine whether the username is valid.  When a cookie is set one way for a valid username and not set (or set differently) for an invalid one, it's an easy tip-off to the bad actors.

User Exists: Set “User” cookie = 138298432 (some random 9 digit value)
User Does Not Exist: Set “User” cookie = 0

Solution:  When a user does not exist, set the cookie to a fake value that appears to be a valid ID, or do not set a cookie at all unless the application requires it.

Registration and Password Forms Redux

Web sites require a unique username or email address when registering. During the registration process, the application alerts the user when a username or email already exists, prompting them to select another. There really is no way around this. For example:

User Exists: Sorry, that email already exists, please select another email address
User Does Not Exist: Congratulations! Your new email address has been set!

So how do we protect ourselves when faced with this situation?  The best way is to implement two-factor authentication.  If the username is an email address, we send a one-time, expiring link for email address changes.  When a user wants to change their email address, we take the following steps:

  • Send a link to the existing email address that expires upon clicking and after a specified amount of time.  That link allows them to change their email address.  Additionally, we can require a specially encoded token.  Not only will an attacker need a link and an email address, they will also need the token.
  • During a new registration, if the user selects an email that already exists in the system, do not alert the user that the email already exists; instead display a message similar to "Thank you, a notification email has been sent to that email address."  The attacker gets no confirmation that the address is registered.
  • Send an email to the new email address advising the user that there was an attempt to register their email address with the application, but the action could not be completed since the email is already registered. For additional security, send a suspicious activity alert back to your web security team.
  • If the user selects an email address that is not already registered, send the standard email address change link.
  • If the username is not an email address, but rather something the user creates on their own, a CAPTCHA or similar technology can be used to limit the rate at which usernames can be enumerated, though it will not eliminate the attack entirely.

While these methods are not fool-proof, by implementing them we make it harder for bad actors to successfully penetrate our security.  Usually, they will move on to an easier target, which is exactly what we want to happen.

Introduction

Thorough testing of a Java application takes code…a lot of code.  Instead of continually creating more files with the same boilerplate code, we can construct one unit test with a configuration file that handles entire groups of similar POJOs or models.

Overview

By using Spring's reflection test utilities and a .json configuration file, we can boil the unit tests down to one test case and one config.  To add a new POJO to be tested, one simply adds another entry to the .json configuration. No new code required!

1. Maven Dependencies

Since this is a Maven-based project, the project wizard will add the required dependencies to the pom.xml:
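The generated pom.xml should already contain spring-boot-starter-test, which brings in JUnit and spring-test, the home of ReflectionTestUtils. The one likely addition is jackson-databind for reading the JSON configuration; both entries below are assumptions, and the version is only an example:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-test</artifactId>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.9.4</version>
</dependency>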

If you want to get the newest version of the libraries above, look for them on Maven Central.

2. Create a POJO

The POJO can be any simple object such as a DTO or plain pojo with Getters and Setters.

Our test DTO has most of the data types we use:
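For illustration, assume a DTO along these lines; the class name and fields are placeholders, and the real one in a project can have any mix of types:

import java.util.Date;

public class SampleDto {

    private Long id;
    private String userName;
    private Boolean active;
    private Date created;

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    public String getUserName() { return userName; }
    public void setUserName(String userName) { this.userName = userName; }

    public Boolean getActive() { return active; }
    public void setActive(Boolean active) { this.active = active; }

    public Date getCreated() { return created; }
    public void setCreated(Date created) { this.created = created; }
}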

3. Abstract POJO Tester Class

The abstract POJO tester class uses reflection and information from the .json configuration file to instantiate the object, query and invoke its getter and setter methods, and then use JUnit assertions to tell us whether the test passed or failed. The code is a bit dense, but it handles the basic data types (String, Boolean, Date, Long, Integer, Collection, Set). We can add more of these as our needs require.

The real wizardry of the tester is its use of the java.lang.Class reflection methods to take the POJO name from the configuration and instantiate it. Then, based on Spring naming conventions, it takes the first key of each entry as the field name, uses reflection to determine the field's data type and converts the data in the second key accordingly. This works with any class we wish to test, simply by adding it to the configuration file.

Here is the really nifty abstract POJO tester class:
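The original class is longer than what fits here; the condensed sketch below captures the idea: read the configuration, instantiate each named class, convert each data value to the field's declared type and assert the getter/setter round trip. The configuration format and names (the fields/name/data keys and the resource location) are assumptions, carried through the next two listings for consistency.

import java.lang.reflect.Field;

import org.junit.Assert;
import org.springframework.test.util.ReflectionTestUtils;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public abstract class AbstractPojoTester {

    // Each concrete test names the classpath location of its JSON configuration.
    protected abstract String getConfigResource();

    protected void runPojoTests() throws Exception {
        JsonNode root = new ObjectMapper()
                .readTree(getClass().getResourceAsStream(getConfigResource()));

        for (JsonNode pojoNode : root) {
            // Instantiate the POJO named in the configuration.
            Class<?> clazz = Class.forName(pojoNode.get("className").asText());
            Object instance = clazz.getDeclaredConstructor().newInstance();

            for (JsonNode fieldNode : pojoNode.get("fields")) {
                String name = fieldNode.get("name").asText();

                // Reflection supplies the declared type; the config only carries text.
                Field field = clazz.getDeclaredField(name);
                Object value = convert(field.getType(), fieldNode.get("data").asText());

                // Set the value, read it back and assert the round trip.
                ReflectionTestUtils.setField(instance, name, value);
                Assert.assertEquals("Round trip failed for " + name,
                        value, ReflectionTestUtils.getField(instance, name));
            }
        }
    }

    // Covers the basic types; extend as needed (Date, Collection, Set, ...).
    private Object convert(Class<?> type, String raw) {
        if (type == Long.class)    return Long.valueOf(raw);
        if (type == Integer.class) return Integer.valueOf(raw);
        if (type == Boolean.class) return Boolean.valueOf(raw);
        return raw; // String and anything not handled yet
    }
}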

4. JSON based Configuration File

The configuration file contains the name and package of the POJO we want to test and the data used to test it. By adding new sections to the file, we can continually add POJOs to our application and not have to write additional testing code.

Here is the JSON configuration to test our POJO:
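Using the format assumed by the sketch above, an entry for the sample DTO could look like this (the package name is a placeholder):

[
  {
    "className": "com.kodekutter.dto.SampleDto",
    "fields": [
      { "name": "id",       "data": "42" },
      { "name": "userName", "data": "jdoe" },
      { "name": "active",   "data": "true" }
    ]
  }
]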

5. JUnit Test Class

To use the POJO tester, we will set it up in a standard JUnit test case:
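A concrete test that plugs into the sketch above is then only a few lines; the configuration file is assumed to live under src/test/resources:

import org.junit.Test;

public class ConfigDrivenPojoTest extends AbstractPojoTester {

    @Override
    protected String getConfigResource() {
        // Loaded from src/test/resources/pojo-test-config.json
        return "/pojo-test-config.json";
    }

    @Test
    public void testConfiguredPojos() throws Exception {
        runPojoTests();
    }
}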

6. Conclusion

In this tutorial we looked at Spring's ReflectionTestUtils and how it lets us write reusable unit test code. By adding a JSON configuration to the code, we are able to extend the tests indefinitely in the future, saving hours of coding time. The code for this tutorial can be found in the KodeKutter GitHub Project.

Introduction

In this article we're going to take a look at reactive programming with Spring's WebFlux framework and how it can be used to provide real-time streaming updates.  WebFlux's reactive programming support provides everything needed to create server-based web applications and a RESTful client interface.

 1. Overview

The heart of the system is the Reactor library, an implementation of the Reactive Streams specification.  To accomplish our goal we will implement the Publisher interface of the library.  We have a choice of using Flux (a stream of zero or more elements) or Mono (at most one element) for our application.  The example planned here is a recurring web page update at a one-second interval.

In keeping with basic Spring architecture, we will use an annotation based controller, a service class and provide a client class to view the results.

2. Maven Dependencies

A Maven-based Spring Boot starter project is needed with the Reactive Web and WebSocket dependencies loaded.  Use the Spring Initializr website or the Spring Boot Project Wizard in the Eclipse-based Spring Tool Suite to get started.  Our project will be named SimpleWebFluxDemo.

Since this is a Maven-based project, the project wizard will add the required dependencies to the pom.xml:
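The generated dependencies section should contain roughly the following; reactor-test, used later by StepVerifier, is added by the Initializr for reactive projects, but it is shown explicitly here in case your generated pom.xml does not include it:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-websocket</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-test</artifactId>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>io.projectreactor</groupId>
    <artifactId>reactor-test</artifactId>
    <scope>test</scope>
</dependency>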

If you want to get the newest version of the libraries above, look for them on Maven Central.

3. Create the Base Application Class

First, we start with a simple base Application class:
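This is nothing beyond the standard Spring Boot entry point (the package name is a placeholder):

package com.kodekutter.simplewebfluxdemo; // placeholder package name

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class SimpleWebFluxDemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(SimpleWebFluxDemoApplication.class, args);
    }
}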

4. Create the Annotated RestController

Then we create an annotated RestController to provide the endpoint for the service.  This is a simple controller with a root "/" mapping.  Our @GetMapping annotation specifies that the controller produces a response of MediaType.TEXT_HTML_VALUE and that it returns a Flux, i.e. a stream of zero or more elements.

Here is a simple annotated rest controller:
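A sketch matching that description; the service class name is an assumption, while SWFluxDemoController is the name referenced by the unit test later on:

import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

import reactor.core.publisher.Flux;

@RestController
public class SWFluxDemoController {

    private final SWFluxDemoService demoService;

    public SWFluxDemoController(SWFluxDemoService demoService) {
        this.demoService = demoService;
    }

    // Root mapping that streams the service's messages as chunks of HTML.
    @GetMapping(path = "/", produces = MediaType.TEXT_HTML_VALUE)
    public Flux<String> stream() {
        return demoService.getMessages();
    }
}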

5. Create the Annotated Service Class

To create the string data for the Flux Demo, we provide an annotated service class that generates the data at a one second interval and includes a date timestamp so we can verify it is occurring at one second intervals.

In the service class, interval is created as a Flux stream that generates an event every second.  By subscribing to the interval stream, we can trigger an action with each event; in this case, the generateNewMessage() method is called.  We then create another Flux stream that provides the latest newMessage generated by that call.

The Flux.zip method joins the two streams together in the order listed, and we then map the response to only the second of the two values (Tuple2::getT2), in this case the newMessage string produced by the generateNewMessage() method.  A tuple is simply an immutable collection of objects used by the Flux.zip method.

Our annotated service class:
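A condensed sketch of such a service; the names other than generateNewMessage() and newMessage are assumptions, and the second interval used to sample the latest message is just one way of building the stream being zipped:

import java.time.Duration;
import java.util.Date;

import org.springframework.stereotype.Service;

import reactor.core.publisher.Flux;
import reactor.util.function.Tuple2;

@Service
public class SWFluxDemoService {

    private volatile String newMessage = "waiting for first message...";

    // Refresh the message with a timestamp; called on every interval tick.
    private void generateNewMessage() {
        newMessage = "Updated response at: " + new Date() + "<br/>";
    }

    public Flux<String> getMessages() {
        // A one-second interval drives the stream; subscribing to it triggers
        // generateNewMessage() on each tick.
        Flux<Long> interval = Flux.interval(Duration.ofSeconds(1));
        interval.subscribe(tick -> generateNewMessage());

        // A second stream that reads the latest message each time it fires.
        Flux<String> messages = Flux.interval(Duration.ofSeconds(1))
                                    .map(tick -> newMessage);

        // Zip the two streams together and keep only the message (T2) of each pair.
        return Flux.zip(interval, messages).map(Tuple2::getT2);
    }
}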

6. Create the Web Client Class

The last thing we need to add is a client class:
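One simple option is a small WebClient-based runner that subscribes to the stream and prints each element as it arrives; the class name and URL are assumptions:

import org.springframework.web.reactive.function.client.WebClient;

public class SWFluxDemoClient {

    public static void main(String[] args) throws InterruptedException {
        WebClient.create("http://localhost:8080")
                .get().uri("/")
                .retrieve()
                .bodyToFlux(String.class)
                .subscribe(System.out::println);

        // Keep the JVM alive long enough to watch a few updates arrive.
        Thread.sleep(10_000);
    }
}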

7. Unit Test for the RestController

The unit test uses the @WebFluxTest annotation to select the SWFluxDemoController for testing and StepVerifier to check three intervals of messages.  StepVerifier comes from the reactor-test module, while @WebFluxTest is part of Spring Boot's test support.  Our unit test:
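A sketch of the test; it mocks the service with @MockBean so the verification does not have to wait on real one-second intervals, and the stubbed messages end in a newline because the client-side String decoder splits the streamed body on line breaks:

import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mockito;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.reactive.WebFluxTest;
import org.springframework.boot.test.mock.mockito.MockBean;
import org.springframework.http.MediaType;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.test.web.reactive.server.WebTestClient;

import reactor.core.publisher.Flux;
import reactor.test.StepVerifier;

@RunWith(SpringRunner.class)
@WebFluxTest(SWFluxDemoController.class)
public class SWFluxDemoControllerTest {

    @Autowired
    private WebTestClient webTestClient;

    @MockBean
    private SWFluxDemoService demoService;

    @Test
    public void streamsThreeMessages() {
        Mockito.when(demoService.getMessages())
               .thenReturn(Flux.just("one\n", "two\n", "three\n"));

        Flux<String> body = webTestClient.get().uri("/")
                .accept(MediaType.TEXT_HTML)
                .exchange()
                .expectStatus().isOk()
                .returnResult(String.class)
                .getResponseBody();

        // Expect exactly three elements, in order, then completion.
        StepVerifier.create(body)
                .expectNext("one", "two", "three")
                .verifyComplete();
    }
}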

8. Running the Example

To run the demo application, open a command prompt window, change to the directory with the project pom.xml and type: mvn spring-boot:run.  You should see the embedded Tomcat server started in the console output along with our SimpleWebFluxDemoApplication.

In your browser, enter the URL http://localhost:8080.  The one-second event messages should start scrolling down your screen.  To stop receiving updates, simply click the stop button on your browser toolbar.

Flux streaming output in browser window:

Updated response at: Wed Mar 07 16:09:55 EST 2018
Updated response at: Wed Mar 07 16:09:56 EST 2018
Updated response at: Wed Mar 07 16:09:57 EST 2018
Updated response at: Wed Mar 07 16:09:58 EST 2018

9. Conclusion

This article has provided a simple example of the capabilities of the Reactor library in Spring 5's WebFlux implementation.  In future articles, we will explore more complex implementations, show how to use Java 8 style lambdas to create a truly functional model and provide some practical use cases.  The code for this example is available in the GitHub Project.