JHipster, is it worth it?

Reading Time: 7 minutes

JHipster is an open-source platform for generating, developing and deploying Spring Boot + Angular / React / Vue web applications. With over 15,000 stars on GitHub, it is the most popular code generation framework for Spring Boot. But is it worth the hype, or is the generated code too difficult to maintain and not production-ready?

How does it work?

The first thing to note is that JHipster is not a separate framework by itself. It uses Yeoman and JDL (.jdl) files to generate Spring Boot code for the backend and Angular, React or Vue for the frontend. After the initial generation of the project, you have the option to use the generated code without ever running JHipster commands again, or to keep using JHipster to incrementally grow the project and develop new features.

What exactly is JDL?

JDL is a JHipster-specific domain language where you can describe all your applications, deployments, entities and their relationships in a single file (or more than one) with a user-friendly syntax.

You can use the online JDL-Studio or one of the JHipster IDE plugins/extensions, which support working with JDL files.

Example of a simple JDL file for a blog application:

entity Blog {
  name String required minlength(3)
  handle String required minlength(2)
}

entity Post {
  title String required
  content TextBlob required
  date Instant required
}

entity Tag {
  name String required minlength(2)
}

relationship ManyToOne {
  Blog{user(login)} to User
  Post{blog(name)} to Blog
}

relationship ManyToMany {
  Post{tag(name)} to Tag{entry}
}

paginate Post, Tag with infinite-scroll
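
To give a feeling for what the generator produces from such a definition, the Blog entity above roughly corresponds to a JPA entity like the following (a simplified sketch, not the exact code JHipster emits):

import javax.persistence.*;
import javax.validation.constraints.NotNull;
import javax.validation.constraints.Size;

@Entity
@Table(name = "blog")
public class Blog {

    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    private Long id;

    // name String required minlength(3) from the JDL
    @NotNull
    @Size(min = 3)
    @Column(name = "name", nullable = false)
    private String name;

    // handle String required minlength(2) from the JDL
    @NotNull
    @Size(min = 2)
    @Column(name = "handle", nullable = false)
    private String handle;

    // Blog{user(login)} to User from the ManyToOne relationship block
    @ManyToOne
    private User user;

    // getters and setters omitted for brevity
}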

Which technologies are used?

On the backend we have the following technologies:

  • Spring Boot as the primary backend framework
  • Maven or Gradle for configuration
  • Spring Security as a Security framework
  • Spring MVC REST + Jackson for REST communication
  • Spring Data JPA + Bean Validation for Object Relational Mapping
  • Liquibase for Database updates
  • MySQL, PostgreSQL, Oracle, MSSQL or MariaDB as SQL databases
  • MongoDB, Couchbase or Cassandra as NoSQL databases
  • Thymeleaf as a templating engine
  • Optional Elasticsearch support if you want to have search capabilities on top of your database
  • Optional Spring WebSockets for Web Socket communication
  • Optional Kafka support as a publish-subscribe messaging system

On the frontend side these technologies are used:

  • Angular or React or Vue as a primary frontend framework
  • Responsive Web Design with Twitter Bootstrap
  • HTML5 Boilerplate compatible with modern browsers
  • Full internationalization support
  • Installation of new JavaScript libraries with NPM
  • Build, optimization and live reload with Webpack
  • Testing with Jest and Protractor
  • Optional Sass support for CSS design

How to get started?

  1. Prerequisites: Java, Git and Node.js
  2. Install JHipster: npm install -g generator-jhipster
  3. Create a new directory and go into it: mkdir myApp && cd myApp
  4. Run JHipster and follow the instructions on the screen: jhipster
  5. Model your entities with JDL-Studio and download the resulting jhipster-jdl.jh file
  6. Generate your entities with jhipster import-jdl jhipster-jdl.jh
  7. Run ./mvnw to start the generated backend
  8. Run npm start to start the generated frontend with live reload support

What do the generated code and application look like?

In case you only want to see a sample generated application without generating a project yourself, you can check the official GitHub repo for the latest up-to-date sample code: https://github.com/jhipster/jhipster-sample-app.

Following are some screens from my up-and-running JHipster application:

Welcome screen – the initial screen when you open your JHipster app

Create a user screen – with this form you can create a new user in the app

View all users screen – in this screen you have the option to manage all your existing users

Monitoring screen – monitoring of JVM metrics, as well as HTTP request statistics

What are the pros and cons?

The important thing to remember is that JHipster is not a “magic bullet” that will solve all your problems, and it is not an optimal solution for every new project. As a good software engineer, you will have to weigh the pros and cons of this platform and decide when it makes sense to use it and when it’s better to go with a different approach. Having used JHipster for production projects, these are some of the pros and cons that I’ve experienced:

Pros

  • Easy bootstrap of a new project with a lot of technologies preconfigured
  • JHipster almost always follows best practices and latest trends in backend and frontend development
  • Login, register, management of users and monitoring comes out-of-the-box
  • Wizard for generating your project, only the technologies that you select are included in the project
  • After defining your own JDL file, all of the model, repository, service and controller classes for your entities are generated, together with integration tests. This saves a lot of time at the beginning of the project, when you want to get to feature development as soon as possible

Cons

  • If you are not familiar with the technologies used in the generated project, it can be overwhelming and it’s easy to get lost in this mix of many different technologies
  • Using JHipster after the initial project generation is not a smooth experience. Classes and Liquibase scripts get overwritten, and you have to be very careful when changing the initial JDL model. Alternatively, you can decide to continue without JHipster after the initial generation of the project
  • REST responses returned from the generated endpoints will not always correspond to business requirements; very often you will have to manually modify the initial JHipster REST responses
  • Not all of the available options are at the same level; some technologies that JHipster uses and configures are more polished than others. This is especially true if you decide to use community modules

What kind of projects are a good fit?

Having said all of this, it’s important to understand that there are projects which can benefit a lot from JHipster and projects that are better off without this platform.

In my experience, a good candidate is a greenfield project where a lot of features are expected to be delivered fast. JHipster will help a lot to be productive from day one and to cut down on the boilerplate code that you need to write, so you will be able to begin with feature development really fast. This works well for new projects with tight deadlines, proofs of concept, internal projects, hackathons and startups.

On the other hand, a not so ideal situation is when you have an already started and up-and-running project; there is not much JHipster can do in this case. Another case is when the application has a lot of specific business logic and is not a simple CRUD application, for example an AI project, a chatbot, or a legacy ecosystem where these new technologies are not suitable or supported.

JHipster, is it worth it?

There is only one sure way to decide whether JHipster is worth it for your next project, and that is to try it out yourself and play around with the different features and configurations that JHipster offers.

At best, you will find a new framework for your next project and save a lot of effort the next time you have to start one. At worst, you will get to know the latest trends in both backend and frontend development and learn some best practices from a very large community.

Reactive Spring with WebFlux and SQL Databases

Reading Time: 6 minutes

Since Spring Boot 2, Spring WebFlux has been available so we can create reactive web applications. This works fine with NoSQL databases, but relational databases were an issue: JDBC database operations are blocking by nature, and this stops you from creating a totally non-blocking application. In order to have an asynchronous and non-blocking application, we need to cover every layer of the application. The hero that solved this is R2DBC – Reactive Relational Database Connectivity, which makes it possible to perform non-blocking calls to relational databases.

The combination of WebFlux and R2DBC is enough to cover every layer in our application that we are going to build. As a relational database, we are going to use H2. So on to the coding!

Go to the Spring Initializr page, from where we are going to build our application, and select the following configuration:

  • Group: com.north47 (or your package name)
  • Artifact: spring-r2dbc
  • Dependencies: Spring Reactive Web, Spring Data R2DBC [Experimental], H2 Database, Lombok

(You won’t be able to see Lombok in this picture, but there it is! If for some reason Lombok is causing you issues, you might need to install a plugin. To do this in IntelliJ, go to File -> Settings -> Plugins, search for Lombok, install it and restart your IDE. If you can’t manage to do it, just go the old way: remove the annotations @Data, @AllArgsConstructor and @NoArgsConstructor from the Book.java class and create your own getters, setters and constructors.)

Now click on Generate, unzip the application and open it via your IDE.

Let’s first create an SQL script that will create our table. Go to src -> main -> resources, right-click on it and select New -> File. Name the file schema.sql and enter the following code:

CREATE TABLE BOOK (
    ID INTEGER IDENTITY PRIMARY KEY,
    NAME VARCHAR(255) NOT NULL,
    AUTHOR VARCHAR(255) NOT NULL
);

This will create a table named BOOK with the following columns: ID, NAME and AUTHOR.

We will create an additional script that will insert some data into our database. Repeat the previous procedure, this time naming the file data.sql, and add the following code:

INSERT INTO BOOK (ID,NAME,AUTHOR) VALUES (1,'Angels and Demons','Dan Brown');
INSERT INTO BOOK (ID,NAME, AUTHOR) VALUES (2,'The Matarese Circle', 'Robert Ludlum');
INSERT INTO BOOK (ID,NAME,AUTHOR) VALUES (3,'Name of the Rose', 'Umberto Eco');

This will put some data into our database.

In resources, delete the application.properties file and create a new application.yml file with the following content:

logging:
  level:
    org.springframework.data.r2dbc: DEBUG
spring:
  r2dbc:
    url: r2dbc:h2:mem:///test?options=DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE
    name: sa
    password:


Now that we have defined the R2DBC URL and enabled the DEBUG logging level for R2DBC, let’s create our Java classes.

Create a new package domain under ‘com.north47.springr2dbc’ and create a new class Book in it. This will be our database model:

package com.north47.springr2dbc.domain;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.springframework.data.annotation.Id;
import org.springframework.data.relational.core.mapping.Column;
import org.springframework.data.relational.core.mapping.Table;

@Table("book")
@Data
@AllArgsConstructor
@NoArgsConstructor
public class Book {

    @Id
    private Long id;

    @Column(value = "name")
    private String name;

    @Column(value = "author")
    private String author;

}

Now, to create our repository, first create a new package named ‘repository’ under ‘com.north47.springr2dbc’. In there, create an interface named BookRepository. This interface will extend R2dbcRepository:

package com.north47.springr2dbc.repository;

import com.north47.springr2dbc.domain.Book;
import org.springframework.data.r2dbc.repository.R2dbcRepository;

public interface BookRepository extends R2dbcRepository<Book, Long> {
}

As you may notice, we are not extending the JpaRepository as usual. The R2dbcRepository will provide us with methods that work with reactive types like Flux and Mono.
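
If you need queries beyond the generated CRUD methods, they can be declared reactively as well. Here is a small sketch (the findByAuthor method and its query are only an illustration, and the exact package of the @Query annotation may differ between Spring Data R2DBC versions):

package com.north47.springr2dbc.repository;

import com.north47.springr2dbc.domain.Book;
import org.springframework.data.r2dbc.repository.Query;
import org.springframework.data.r2dbc.repository.R2dbcRepository;
import reactor.core.publisher.Flux;

public interface BookRepository extends R2dbcRepository<Book, Long> {

    // Custom query that returns a reactive stream of results instead of a blocking List
    @Query("SELECT * FROM book WHERE author = :author")
    Flux<Book> findByAuthor(String author);
}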

After this, we will create endpoints from where we can access the previously inserted data, create new entries, modify them or delete them.

Create a new package ‘resource’ under the ‘com.north47.springr2dbc’ package and in there we will create our BookResource:

package com.north47.springr2dbc.resource;

import com.north47.springr2dbc.domain.Book;
import com.north47.springr2dbc.repository.BookRepository;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

@RestController
@RequestMapping(value = "/books")
public class BookResource {

    private final BookRepository bookRepository;

    public BookResource(BookRepository bookRepository) {
        this.bookRepository = bookRepository;
    }

    @GetMapping(produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<Book> getAllBooks() {
        return bookRepository.findAll();
    }

    @GetMapping(value = "/{id}")
    public Mono<Book> findById(@PathVariable Long id) {
        return bookRepository.findById(id);
    }

    @PostMapping(consumes = MediaType.APPLICATION_JSON_VALUE)
    public Mono<Book> save(@RequestBody Book book) {
        return bookRepository.save(book);
    }

    @DeleteMapping(value = "/{id}")
    public Mono<Void> delete(@PathVariable Long id) {
        return bookRepository.deleteById(id);
    }
}

And there we have endpoints from where we can access our data and modify it.

On to Postman so we can test our application – but of course, first, let’s start it. When you run the application, you can see in the console that the server has started:

Netty started on port(s): 8080

Also, since we enabled the DEBUG log level, you should be able to see all the SQL queries that are executed from the scripts that we wrote previously.

In Postman, set a GET method and the URL localhost:8080/books. In the Headers tab add the key ‘Content-Type’ with value ‘application/json’.

Press that Send button and there it is: you will get the data:

data:{"id":1,"name":"Angels and Demons","author":"Dan Brown"}

data:{"id":2,"name":"The Matarese Circle","author":"Robert Ludlum"}

data:{"id":3,"name":"Name of the Rose","author":"Umberto Eco"}

You can also test the other endpoints, for example getting a book by id just by changing the URL to localhost:8080/books/1. The result will be:

{
    "id": 1,
    "name": "Angels and Demons",
    "author": "Dan Brown"
}

Now you can test the other endpoints by creating a new Book with a POST request to localhost:8080/books, or by deleting a book with a DELETE request to localhost:8080/books/{id}.
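
If you prefer to consume the endpoints from code instead of Postman, a minimal sketch using WebClient (already on the classpath via the Spring Reactive Web starter) could look like this:

import com.north47.springr2dbc.domain.Book;
import org.springframework.http.MediaType;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Flux;

public class BookClient {

    public static void main(String[] args) {
        WebClient client = WebClient.create("http://localhost:8080");

        // Consume the event stream exposed by GET /books
        Flux<Book> books = client.get()
                .uri("/books")
                .accept(MediaType.TEXT_EVENT_STREAM)
                .retrieve()
                .bodyToFlux(Book.class);

        // Print every book as it arrives and wait until the stream completes
        books.doOnNext(book -> System.out.println(book.getName()))
                .blockLast();
    }
}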

Here you can find the whole code:

Spring-R2DBC

Hope you enjoyed it!

A simple way of using Micrometer, Prometheus and Grafana (Spring Boot 2)

Reading Time: 7 minutes

When we run any Java application, we are running a JVM. That JVM uses resources like memory, processor, etc. The same happens when we run any Spring application; it runs and uses our hardware resources. Monitoring and measuring these parameters is crucial when we are in production or when we want to test the performance of our application. With Spring, it is easy: we just include Spring Boot Actuator and it gives us access to almost all the measurements that we need, like:

"jvm.memory.max",
"jvm.threads.states",
"jvm.gc.memory.promoted",
"jvm.memory.used",
"jvm.gc.max.data.size",
"jvm.gc.pause",
"jvm.memory.committed",
"system.cpu.count",
"logback.events",
…

To set up Spring Boot Actuator, add the following dependency to our project:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-actuator</artifactId>
</dependency>

and on the following endpoint:

<host/context-path>/actuator

we will have basic links to additional features of the application and monitoring:

{
    "_links": {
        "self": {
            "href": "http://localhost:8080/actuator",
            "templated": false
        },
        "health": {
            "href": "http://localhost:8080/actuator/health",
            "templated": false
        },
        "health-path": {
            "href": "http://localhost:8080/actuator/health/{*path}",
            "templated": true
        },
        "info": {
            "href": "http://localhost:8080/actuator/info",
            "templated": false
        }
    }
}

If this basic information is not enough, we can extend it by adding the following parameter in the application configuration file:

management.endpoints.web.exposure.include=*

By following any of these links, we will access the details. For our purposes it will be http://localhost:8080/actuator/metrics, from which we are going to access the metrics of our application.

Now we have almost everything we need to monitor how our application performs: requests, JVM memory, cache, threads, etc.

Micrometer

However, if we have some more logic in our code and we need more precise metrics for it, we will need some other way to obtain them. Spring Boot 2 Actuator enriches all these already existing metrics through the Micrometer data provider.

Micrometer is a dimensional-first metrics collection facade whose aim is to allow you to time, count, and gauge your code with a vendor-neutral API.

Moreover, Micrometer is a vendor-neutral data provider and exposes application metrics to external monitoring systems like Prometheus, AWS CloudWatch, etc.

Micrometer gives us a set of Meter primitives, including Timer, Counter, Gauge, DistributionSummary, LongTaskTimer, FunctionCounter, FunctionTimer and TimeGauge. Here we should be aware that each meter type produces a different number of time-series metrics: a gauge has a single metric, but a timer has both a count of timed events and a total time of all timed events.

If we write something like this in our code:

List<Integer> gaugeList = registry.gauge("dummy.gauge.list", Collections.emptyList(), someList, List::size);
List<Integer> gaugeCollectionsSizeList = registry.gaugeCollectionSize("dummy.size.list", Tags.empty(), someList);
Map<Integer, Integer> gaugeMapSize = registry.gaugeMapSize("dummy.gauge.map", Tags.empty(), someMap);

registry.timer("dummy.timer", Tags.empty()).record(() -> {
    slowDummyMethod();
});

We will get three time series for the Timer (dummy_timer_seconds_count, dummy_timer_seconds_max, dummy_timer_seconds_sum), plus the dummy_gauge_list, dummy_size_list and dummy_gauge_map gauges.
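
The registry object used above is the MeterRegistry bean that Spring Boot auto-configures once the actuator and a registry implementation are on the classpath. A minimal sketch of wiring it into your own component (the class and metric names here are just illustrative) could look like this:

import io.micrometer.core.instrument.Counter;
import io.micrometer.core.instrument.MeterRegistry;
import org.springframework.stereotype.Component;

@Component
public class DummyMetrics {

    private final Counter dummyCounter;

    // Spring Boot injects the auto-configured MeterRegistry
    public DummyMetrics(MeterRegistry registry) {
        this.dummyCounter = registry.counter("dummy.counter");
    }

    public void recordDummyEvent() {
        // Exposed to Prometheus as dummy_counter_total
        dummyCounter.increment();
    }
}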

All this data can be used by many monitoring systems like Netflix Atlas, CloudWatch, Datadog, Ganglia, etc. In our case, we will use Prometheus.

Prometheus

Including Prometheus in our project is easy – just add the Maven dependency:

<dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-registry-prometheus</artifactId>
</dependency>

This will create a new endpoint in the actuator: http://localhost:8080/actuator/prometheus. If we access this URL, we will get the metrics from Micrometer.

To see this data in some graphic UI, we will have to start a Prometheus server. We can do that directly by downloading the Prometheus server and running it:

https://prometheus.io/download/

The configuration is in the prometheus.yml file.

Basic parameters that we should set up here are:

global:
  scrape_interval:     10s # Scrape interval to every 10 seconds. Default value is every 1 minute.

and

scrape_configs:
  - job_name: 'spring_micrometer'

    metrics_path: '/micromexample/actuator/prometheus' # Path to the prometheus end point in our application. “micromexample” is the context and “actuator/prometheus” is default path for prometheus in our application
    static_configs:
    - targets: ['localhost:8080'] # host where our application is deployed

Another way to have a Prometheus server is to run a Docker image which contains Prometheus. We can do that with the following command:

docker run -d -p 9090:9090 -v <yours-prometheus-config-file.yml>:/etc/prometheus/prometheus.yml prom/prometheus

“9090” – the port where our Prometheus will listen, this value is the default port

<yours-prometheus-config-file.yml> – our configuration file for Prometheus

“prom/prometheus” – docker image with Prometheus

After we run the Spring Boot application with Prometheus included and start the Prometheus server, we should be able to see the metrics in a basic view from Prometheus at:

http://localhost:9090/graph

this is what we should get from our service:

For this graph, we wrote the following code (to have something that makes sure everything works):

registry.timer("dummy.timer", Tags.empty()).record(() -> {
    slowDummyMethod();
});

Grafana

If we want a rich graphical UI that makes it easy to browse through the metrics data, edit dashboards and integrate with cloud monitoring, then it is a good idea to use Grafana.

Setting up Grafana is similar to Prometheus: we will need a Grafana server.

Again, we can download and install it locally. This way, we will have it as a service in our OS:

https://grafana.com/get

Or run docker image with Grafana in it:

docker run -d -p 3000:3000 grafana/grafana

“3000” – port for grafana

“grafana/grafana” – docker image with grafana

The default user and password are admin/admin. On the first login, you will be asked to set a new password.

After we log in, we should add a data source from which Grafana will read the metrics. Go to the left menu: Configuration -> Data Sources, choose the “Data Sources” tab and click “Add data source”.

Since we decided to go with Prometheus, we will select the Prometheus source. On the new page (Configuration), because we did not set up any authentication or anything else in Prometheus – everything is default – we only need to set the HTTP -> URL field. In our case, it will be “http://localhost:9090”. If everything is OK, clicking “Save and test” should give us a green bar indicating that Grafana is connected to Prometheus and we can access the metrics from it.

Let’s see our first metrics from the timer that we added in our application. For this one we will create our own new dashboard:

Choose “Add Query” and in the new window add the following key in “Metrics”: “dummy_timer_seconds_count”. This will add one metric to our graph.

In the same graph, we can add a second one from the timer, “dummy_timer_seconds_max”. With this, we will have both metrics in the same graph.

There are other parameters that you can set, but for basic setup default values are fine.

With this, we have set up everything we need for monitoring our application. Next is to add more graphs for metrics that we want to monitor.

Securing your microservices with OAuth 2.0. Building Authorization and Resource server

Reading Time: 8 minutes

We live in a world of microservices. They give us an easy opportunity to scale our applications. But as we scale an application, it becomes more and more vulnerable. We need to think of a way to protect our services and to keep the wrong people from accessing protected resources. One way to do that is by enabling user authorization and authentication. With authorization and authentication, we need a way to manage credentials, check the access of the requester and make sure people are doing what they are supposed to.

When we speak about Spring (Cloud) Security, we are talking about service authorization powered by OAuth 2.0. This is how it works:


The actors in this OAuth 2.0 scenario that we are going to discuss are:

  • Resource Owner – Entity that grants access to a resource, usually you!
  • Resource Server – Server hosting the protected resource
  • Client – App making protected resource requests on behalf of a resource owner
  • Authorization server – server issuing access tokens to clients

The client asks the resource owner to authorize it. When the resource owner provides an authorization grant, the client sends a request to the authorization server. The authorization server replies by sending an access token to the client. Now that the client has an access token, it puts it in the header and asks the resource server for the protected resource. And finally, the client gets the protected data.

Now that everything is clear about how the general OAuth 2.0 flow is working, let’s get our hands dirty and start writing our resource and authorization server!

Building OAuth2.0 Authorization server

Let’s start by creating our authorization server using the Spring Initializr. Create a project with the following configuration:

  • Project: Maven Project
  • Artifact: auth-server
  • Dependencies: Spring Web, Cloud Security, Cloud OAuth2

Download the project, copy it into your workspace and open it via your IDE. Go to your main class and add the @EnableAuthorizationServer annotation.

@SpringBootApplication
@EnableAuthorizationServer
public class AuthServerApplication {

    public static void main(String[] args) {
        SpringApplication.run(AuthServerApplication.class, args);
    }

}

Go to the application.properties file and make the following modification:

  • Change the server port to 8083
  • Set the context path to be “/api/auth”
  • Set the client id to “north47”
  • Set the client secret to “north47secret”
  • Enable all authorized grant types
  • Set the client scope to read and write

server.port=8083

server.servlet.context-path=/api/auth

security.oauth2.client.client-id=north47
security.oauth2.client.client-secret=north47secret
security.oauth2.client.authorized-grant-types=authorization_code,password,refresh_token,client_credentials
security.oauth2.client.scope=read,write

The client id is a public identifier for applications. The way we used it here is not a good practice for a production environment; it is usually a 32-character hex string so it won’t be so easily guessable.
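
For illustration, such a 32-character hex identifier can be generated with a few lines of plain Java (this is just a sketch of the idea, not part of the auth server code):

import java.security.SecureRandom;

public class ClientSecretGenerator {

    public static void main(String[] args) {
        SecureRandom random = new SecureRandom();
        byte[] bytes = new byte[16]; // 16 random bytes -> 32 hex characters
        random.nextBytes(bytes);

        StringBuilder secret = new StringBuilder();
        for (byte b : bytes) {
            secret.append(String.format("%02x", b));
        }
        System.out.println(secret); // a value suitable as a client id or client secret
    }
}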

Let’s add some users to our application. We are going to use in-memory users, and we will achieve that by creating a new class ServiceConfig. Create a package called “config” with the following path: com.north47.authserver.config and in there create the above-mentioned class:

@Configuration
public class ServiceConfig extends GlobalAuthenticationConfigurerAdapter {

    @Override
    public void init(AuthenticationManagerBuilder auth) throws Exception {
        auth.inMemoryAuthentication()
                .withUser("filip")
                .password(passwordEncoder().encode("1234"))
                .roles("ADMIN");
    }

    @Bean
    public BCryptPasswordEncoder passwordEncoder() {
        return new BCryptPasswordEncoder();
    }
}

With this we are defining one user with username ‘filip’ and password ‘1234’, with the role ADMIN. We define the BCryptPasswordEncoder bean so we can encode our password.

In order to authenticate the users that arrive from other services, we are going to add another class called UserResource into a newly created package resource (com.north47.authserver.resource):

@RestController
public class UserResource {

    @RequestMapping("/user")
    public Principal user(Principal user) {
        return user;
    }
}

When users from other services send a token for validation, the user will also be validated with this method.

And that’s it! Now we have our authorization server! The authorization server provides some default endpoints, which we are going to see when we test the resource server.

Building Resource Server

Now let’s build our resource server where we are going to keep our secure data. We will do that with the help of the Spring Initializr. Create a project with the following configuration:

  • Project: Maven Project
  • Artifact: resource-server
  • Dependencies: Spring Web, Cloud Security, Cloud OAuth2

Download the project and copy it into your workspace. First, we are going to create our entity called Train. Create a new package called domain in com.north47.resourceserver and create the class there.

public class Train {

    private int trainId;
    private boolean express;
    private int numOfSeats;

    public Train(int trainId, boolean express, int numOfSeats) {
        this.trainId = trainId;
        this.express = express;
        this.numOfSeats = numOfSeats;
    }

    public int getTrainId() {
        return trainId;
    }

    public void setTrainId(int trainId) {
        this.trainId = trainId;
    }

    public boolean isExpress() {
        return express;
    }

    public void setExpress(boolean express) {
        this.express = express;
    }

    public int getNumOfSeats() {
        return numOfSeats;
    }

    public void setNumOfSeats(int numOfSeats) {
        this.numOfSeats = numOfSeats;
    }

}

Let’s create one resource that will expose an endpoint from where we can get the protected data. Create a new package called resource and in there create a class TrainResource. We will have only one method, which exposes an endpoint behind which we can get the protected data.

@RestController
@RequestMapping("/train")
public class TrainResource {


    @GetMapping
    public List<Train> getTrainData() {

        return Arrays.asList(new Train(1, true, 100),
                new Train(2, false, 80),
                new Train(3, true, 90));
    }
}

Let’s start the application and send a GET request to http://localhost:8082/api/services/train. You will be asked to enter a username and password. The username is user and the password you can see in the console where the application was started. Entering these credentials will give you the protected data.

Let’s change the application now to be a resource server by going to the main class ResourceServerApplication and adding the annotation @EnableResourceServer.

@SpringBootApplication
@EnableResourceServer
public class ResourceServerApplication {

    public static void main(String[] args) {
        SpringApplication.run(ResourceServerApplication.class, args);
    }

}

Go to the application.properties file and make the following changes:

server.port=8082
server.servlet.context-path=/api/services
security.oauth2.resource.user-info-uri=http://localhost:8083/api/auth/user 

What we have done here is:

  • Changed our server port to 8082
  • Set context path: /api/services
  • Set the user info URI where the user will be validated when a token is passed

Now if you try to get the protected data by sending a GET request to http://localhost:8082/api/services/train the server will return to you a message that you are unauthorized and that full authentication is required. That means that without a token you won’t be able to access the resource.

So we need a fresh new token in order to get the data. We will ask the authorization server to give us a token for the user that we previously created. Our client in this scenario will be Postman. The authorization server that we previously created exposes some endpoints out of the box. To ask the authorization server for a fresh new token, send a POST request to the following URL: localhost:8083/api/auth/oauth/token.

As said previously, Postman in this scenario is the client that is accessing the resource, so it needs to send the client credentials to the authorization server. Those are the client id and the client secret. Go to the Authorization tab and add the client id (north47) as the username and the client secret (north47secret) as the password. The picture below shows how to set up the request:

What is left is to supply the username and password of the user. Open the Body tab, select x-www-form-urlencoded and add the following values:

  • key: ‘grant_type’, value: ‘password’
  • key: ‘client_id’, value: ‘north47’
  • key: ‘username’, value: ‘filip’
  • key: ‘password’, value: ‘1234’

Press send and you will get a response with the access_token:

{
    "access_token": "ae27c519-b3da-4da8-bacd-2ffc98450b18",
    "token_type": "bearer",
    "refresh_token": "d97c9d2d-31e7-456d-baa2-c2526fc71a5a",
    "expires_in": 43199,
    "scope": "read write"
}

Now that we have the access token, we can call our protected resource by inserting the token into the header of the request. Open Postman again and send a GET request to localhost:8082/api/services/train. Open the Headers tab: this is where we insert the access token. For the key add “Authorization” and for the value add “Bearer ae27c519-b3da-4da8-bacd-2ffc98450b18”.
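
The same call can of course be made from code as well; a minimal sketch with RestTemplate (the token value is the one returned by the token endpoint, and the response is read as a plain String to keep the example short):

import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpMethod;
import org.springframework.http.ResponseEntity;
import org.springframework.web.client.RestTemplate;

public class TrainClient {

    public static void main(String[] args) {
        RestTemplate restTemplate = new RestTemplate();

        // Put the access token in the Authorization header
        HttpHeaders headers = new HttpHeaders();
        headers.set(HttpHeaders.AUTHORIZATION, "Bearer ae27c519-b3da-4da8-bacd-2ffc98450b18");

        ResponseEntity<String> response = restTemplate.exchange(
                "http://localhost:8082/api/services/train",
                HttpMethod.GET,
                new HttpEntity<>(headers),
                String.class);

        System.out.println(response.getStatusCode() + ": " + response.getBody());
    }
}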


And there it is! You have authorized yourself and got a new token which allowed you to get the protected data.

You can find the projects in our repository:

And that’s it! Hope you enjoyed it!

JMeter

Reading Time: 4 minutes

Today, we are going to take a look at JMeter. You can embed it in your application as a library and build an external integration testing solution. You don’t have to use it for load testing; it could simply send one request, check the return status code, check the return value and move on. There is an argument that JMeter may be overkill for that, but it provides an easy way to verify the response, allows you to set everything up using the JMeter desktop app, and then lets you move on to testing latency under load.

First, we need to create a test file that will later be put into our Spring Boot application. The steps for creating the .jmx file are as follows:

1 – Open the JMeter window by clicking on /home/apache-jmeter-5.1.1/bin/jmeter.bat. The next step you want to do with every JMeter Test Plan is to add a Thread Group element. Set the “Loop Count” parameter equal to 1, as shown below:

2 – The next step is to create a While Controller. The purpose of the While Controller is to repeat a given set of actions until the condition is broken. While is a basic programming concept for running actions where the number of iterations is unknown or varying.

3 – Create an HTTP request as shown in the figure below:

4 – Afterwards, we are going to create a CSV Data Set Config. This step refers to the CSV file from which the partner users will be collected and substituted into the HTTP request.

5 – After running our test, we want to see the results, e.g. which calls have been made and which ones have failed. This is done through Listeners, a recording mechanism that shows results, including logging and debugging.

The View Results Tree is the most common Listener.

Right-click – Add->Listener->View Results Tree

6 – At the end, it should be something like the figure below:

Now click ‘Save’. Your test will be saved as a .jmx file. To run the test, click the green arrow at the top. After the test completes, you can view the results in the Listener as in the figure below. In this example, you can see the tests were successful because they’re green. On the right, you can see more detailed results, like load time, connect time, errors, the request data, the response data, etc. You can also save the results if you want to.

JMeter can also be used for Maven testing through a plugin and works quite nicely with variables, prerequisites, etc. Integrating performance testing in your projects has many benefits:

  • It provides a constant check of the performances of the application
  • Secures continuous delivery in production
  • Allows early detection of performance problems or performance regressions
  • Automating the process means less manual work, allowing your team to focus on more valuable tasks like performance analysis and optimisation

First of all, you need to add the plugin to the project. Go to the Maven project directory (jmeter-testproject in this case) and edit the pom.xml file. Here you must add the plugin. You can find the basic configuration here. You just need to copy the configuration text and paste it into your pom.xml file.

Finally, you have a plugin section in your pom.xml file that looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<plugin>
   <groupId>com.lazerycode.jmeter</groupId>
   <artifactId>jmeter-maven-plugin</artifactId>
   <version>2.9.0</version>
   <executions>
      <execution>
         <id>jmeter-tests</id>
         <phase>test</phase>
         <goals>
            <goal>jmeter</goal>
         </goals>
      </execution>
      <execution>
         <id>jmeter-check-results</id>
         <phase>test</phase>
         <goals>
            <goal>results</goal>
         </goals>
      </execution>
   </executions>
   <configuration>
      <testFilesDirectory>src/test/resources/jmeter</testFilesDirectory>
      <ignoreResultFailures>false</ignoreResultFailures>
   </configuration>
</plugin>

In the code above we will be running two goals:

  • jmeter in the test phase: this goal will run the load test and generate the HTML report
  • results in the test phase: this goal runs verification on the error rate and fails the build if there are failed requests (since ignoreResultFailures is set to false)

Testing Spring Boot application with examples

Reading Time: 7 minutes

Why bother writing tests is already a well-discussed topic in software engineering. I won’t go into much detail on this topic, but I will mention some of the main benefits.

In my opinion, testing your software is the only way to achieve confidence that your code will work in the production environment. Another huge benefit is that it allows you to refactor your code without fear of breaking existing features.

Risk of bugs vs the number of tests

In the Java world, one of the most popular frameworks is Spring Boot, and part of the popularity and success of Spring Boot is exactly the topic of this blog – testing. Spring Boot and the Spring framework offer out-of-the-box support for testing, and new features are being added constantly. When the Spring framework appeared on the Java scene in 2005, one of the reasons for its success was exactly this: ease of writing and maintaining tests, as opposed to Java EE, where writing integration tests requires additional libraries like Arquillian.

In the following, I will go over the different types of tests in Spring Boot, explain when to use them and give a short example of each.

Testing pyramid

We can roughly group all automated tests into 3 groups:

  • Unit tests
  • Service (integration) tests
  • UI (end to end) tests

As we go from the bottom of the pyramid to the top, tests become slower to execute: if we measure execution times, unit tests will be in the order of a few milliseconds, service tests in hundreds of milliseconds, and UI tests will execute in seconds. If we measure the scope of tests, unit tests, as the name suggests, test small units of code; service tests exercise a whole service, or a slice of that service that involves multiple units; and UI tests have the largest scope, testing multiple different services. In the following sections, I will go over some examples of how we can unit test and service test a Spring Boot application. UI testing can be achieved using external tools like Selenium and Protractor, but they are not related to Spring Boot.

Unit testing

In my opinion, unit tests make the most sense when you have some kind of validators, algorithms or other code that has lots of different inputs and outputs, and executing integration tests would take too much time. Let’s see how we can test a validator with Spring Boot.

Validator class for emails

public class Validators {

    private static final String EMAIL_REGEX = "(?:[a-z0-9!#$%&'*+/=?^_`{|}~-]+(?:\\.[a-z0-9!#$%&'*+/=?^_`{|}~-]+)*|\"(?:[\\x01-\\x08\\x0b\\x0c\\x0e-\\x1f\\x21\\x23-\\x5b\\x5d-\\x7f]|\\\\[\\x01-\\x09\\x0b\\x0c\\x0e-\\x7f])*\")@(?:(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\\.)+[a-z0-9](?:[a-z0-9-]*[a-z0-9])?|\\[(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?|[a-z0-9-]*[a-z0-9]:(?:[\\x01-\\x08\\x0b\\x0c\\x0e-\\x1f\\x21-\\x5a\\x53-\\x7f]|\\\\[\\x01-\\x09\\x0b\\x0c\\x0e-\\x7f])+)\\])";

    public static boolean isEmailValid(String email) {
        return email.matches(EMAIL_REGEX);
    }
}

Unit tests for email validator with Spring Boot

@RunWith(MockitoJUnitRunner.class)
public class ValidatorsTest {
    @Test
    public void testEmailValidator() {
        assertThat(isEmailValid("valid@north-47.com")).isTrue();
        assertThat(isEmailValid("invalidnorth-47.com")).isFalse();
        assertThat(isEmailValid("invalid@47")).isFalse();
    }
}

MockitoJUnitRunner is used for running Mockito in tests and for the detection of @Mock annotations. In this case, we are testing the email validator as a separate unit from the rest of the application. MockitoJUnitRunner is not a Spring Boot annotation, so this way of writing unit tests can be used in other frameworks as well.
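
To make the role of the runner more concrete, here is a small sketch of a test that does rely on @Mock detection. BookService here is a hypothetical service that delegates to BookRepository; it is not part of the example project:

@RunWith(MockitoJUnitRunner.class)
public class BookServiceTest {

    @Mock
    private BookRepository bookRepository; // mock created and injected by MockitoJUnitRunner

    @InjectMocks
    private BookService bookService; // hypothetical service under test

    @Test
    public void testShouldDelegateFindByIdToRepository() {
        Book book = new Book(1L, "Asimov", "Foundation", 350);
        when(bookRepository.findById(1L)).thenReturn(Optional.of(book));

        assertThat(bookService.findById(1L)).contains(book);
    }
}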

Integration testing of the whole application

If we have to choose only one type of test in Spring Boot, then using integration tests that exercise the whole application makes the most sense. We will not be able to cover all the scenarios, but we will significantly reduce the risk. In order to do integration testing, we need to start the application context. In Spring Boot 2, this is achieved with the following annotations: @RunWith(SpringRunner.class) and @SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT). This will start the application on a random port, and we can inject beans into our tests and make REST calls to the application endpoints.

The following is example code for testing book endpoints. For making REST API calls we are using Spring’s TestRestTemplate, which is more suitable for integration tests than RestTemplate.

@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
public class SpringBootTestingApplicationTests {

    @Autowired
    private TestRestTemplate restTemplate;

    @Autowired
    private BookRepository bookRepository;

    private Book defaultBook;

    @Before
    public void setup() {
        defaultBook = new Book(null, "Asimov", "Foundation", 350);
    }

    @Test
    public void testShouldReturnCreatedWhenValidBook() {
        ResponseEntity<Book> bookResponseEntity = this.restTemplate.postForEntity("/books", defaultBook, Book.class);

        assertThat(bookResponseEntity.getStatusCode()).isEqualTo(HttpStatus.CREATED);
        assertThat(bookResponseEntity.getBody().getId()).isNotNull();
        assertThat(bookRepository.findById(1L)).isPresent();
    }

    @Test
    public void testShouldFindBooksWhenExists() throws Exception {
        Book savedBook = bookRepository.save(defaultBook);

        ResponseEntity<Book> bookResponseEntity = this.restTemplate.getForEntity("/books/" + savedBook.getId(), Book.class);

        assertThat(bookResponseEntity.getStatusCode()).isEqualTo(HttpStatus.OK);
        assertThat(bookResponseEntity.getBody().getId()).isEqualTo(savedBook.getId());
    }

    @Test
    public void testShouldReturn404WhenBookMissing() throws Exception {
        Long nonExistingId = 999L;
        ResponseEntity<Book> bookResponseEntity = this.restTemplate.getForEntity("/books/" + nonExistingId, Book.class);

        assertThat(bookResponseEntity.getStatusCode()).isEqualTo(HttpStatus.NOT_FOUND);
    }
}

Integration testing of web layer (controllers)

Spring Boot offers the ability to test layers in isolation, starting only the beans that are required for the test. Since Spring Boot v1.4 there is a very convenient annotation, @WebMvcTest, that loads only the components required for a typical web layer test, like controllers, Jackson converters and similar, without starting the full application context, thus avoiding the startup of unnecessary components such as the database layer. When we are using this annotation, we make the REST calls with the MockMvc class.

The following is an example of testing the same endpoints as in the example above, but in this case we are only testing whether the web layer works as expected, and we are mocking the database layer using the @MockBean annotation, which is also available starting from Spring Boot v1.4. Using these annotations, only BookController is loaded into the application context and the database layer is mocked.

@RunWith(SpringRunner.class)
@WebMvcTest(BookController.class)
public class BookControllerTest {
    @Autowired
    private MockMvc mockMvc;

    @MockBean
    private BookRepository repository;

    @Autowired
    private ObjectMapper objectMapper;

    private static final Book DEFAULT_BOOK = new Book(null, "Asimov", "Foundation", 350);

    @Test
    public void testShouldReturnCreatedWhenValidBook() throws Exception {
        when(repository.save(Mockito.any())).thenReturn(DEFAULT_BOOK);

        this.mockMvc.perform(post("/books")
                .content(objectMapper.writeValueAsString(DEFAULT_BOOK))
                .contentType(MediaType.APPLICATION_JSON)
                .accept(MediaType.APPLICATION_JSON))
                .andExpect(status().isCreated())
                .andExpect(MockMvcResultMatchers.jsonPath("$.name").value(DEFAULT_BOOK.getName()));
    }

    @Test
    public void testShouldFindBooksWhenExists() throws Exception {
        Long id = 1L;
        when(repository.findById(id)).thenReturn(Optional.of(DEFAULT_BOOK));

        this.mockMvc.perform(get("/books/" + id)
                .accept(MediaType.APPLICATION_JSON))
                .andExpect(status().isOk())
                .andExpect(MockMvcResultMatchers.content().string(Matchers.is(objectMapper.writeValueAsString(DEFAULT_BOOK))));
    }

    @Test
    public void testShouldReturn404WhenBookMissing() throws Exception {
        Long id = 1L;
        when(repository.findById(id)).thenReturn(Optional.empty());

        this.mockMvc.perform(get("/books/" + id)
                .accept(MediaType.APPLICATION_JSON))
                .andExpect(status().isNotFound());
    }
}

Integration testing of database layer (repositories)

Similarly to the way we tested the web layer, we can test the database layer in isolation, without starting the web layer. This kind of testing in Spring Boot is achieved using the @DataJpaTest annotation. This annotation applies only the auto-configuration related to the JPA layer and by default uses an in-memory database, because it is the fastest to start up and for most integration tests will do just fine. We also get access to TestEntityManager, which is an EntityManager with supporting features for JPA integration tests.

The following is an example of testing the database layer of the above application. With these tests we are only checking whether the database layer works as expected: we are not making any REST calls, and we verify the results from BookRepository by using the provided TestEntityManager.

@RunWith(SpringRunner.class)
@DataJpaTest
public class BookRepositoryTest {
    @Autowired
    private TestEntityManager entityManager;

    @Autowired
    private BookRepository repository;

    private Book defaultBook;

    @Before
    public void setup() {
        defaultBook = new Book(null, "Asimov", "Foundation", 350);
    }

    @Test
    public void testShouldPersistBooks() {
        Book savedBook = repository.save(defaultBook);

        assertThat(savedBook.getId()).isNotNull();
        assertThat(entityManager.find(Book.class, savedBook.getId())).isNotNull();
    }

    @Test
    public void testShouldFindByIdWhenBookExists() {
        Book savedBook = entityManager.persistAndFlush(defaultBook);

        assertThat(repository.findById(savedBook.getId())).isEqualTo(Optional.of(savedBook));
    }

    @Test
    public void testFindByIdShouldReturnEmptyWhenBookNotFound() {
        long nonExistingID = 47L;
        
        assertThat(repository.findById(nonExistingID)).isEqualTo(Optional.empty());
    }
}

Conclusion

You can find a working example with all of these tests on the following repo: https://gitlab.com/47northlabs/public/spring-boot-testing.

In the following table, I show the execution times, including startup, of the different types of tests that I’ve used as examples. We can clearly see that unit tests, as mentioned in the beginning, are the fastest ones, and that separating integration tests into layered tests leads to faster execution times.

Type of test          Execution time with startup
Unit test             80 ms
Integration test      620 ms
Web layer test        190 ms
Database layer test   220 ms

Generate Spring Boot REST API using Swagger/OpenAPI

Reading Time: 5 minutes

Writing an API definition is pretty cool stuff. It helps consumers understand the API and agree on its attributes. In our company, we are using the OpenAPI Specification (formerly the Swagger Specification) for that purpose.

But the real deal is generating code and documentation from the specification file. In this blog, I will show you how we are doing that at N47.

We will split this blog into two parts. The first part will be about generating code, and the second part will be about using the generated code.

Part 1

We are creating an empty Maven project named “demo-specification”.

The next thing is creating an API definition file, api.yaml, in the src/main/resources/ directory. The demo content of this file is:

openapi: "3.0.0"
info:
  description: "Codegen for demo service"
  version: "0.0.1"
  title: "Demo Service Specification"
  contact:
    email: "antonie.zafirov@north-47.com"
tags:
  - name: "user"
    description: "User tag for demo purposes"
servers:
  - url: http://localhost:8000/
    description: "local host"
paths:
  /user/{id}:
    get:
      tags:
        - "user"
      summary: "Retrieves User by ID"
      operationId: "getUserById"
      parameters:
        - name: "id"
          in: "path"
          description: "retrieves user by user id"
          required: true
          schema:
            type: "integer"
            format: "int64"
      responses:
        200:
          description: "Retrieves family members by person id"
          content:
            application/json:
              schema:
                type: "object"
                $ref: '#/components/schemas/User'
components:
  schemas:
    User:
      type: "object"
      required:
        - "id"
        - "firstName"
        - "lastName"
        - "dateOfBirth"
        - "gender"
      properties:
        id:
          type: "integer"
          format: "int64"
        firstName:
          type: "string"
          example: "John"
        lastName:
          type: "string"
          example: "Smith"
        dateOfBirth:
          type: "string"
          example: "1992-10-05"
        gender:
          type: "string"
          enum:
            - "MALE"
            - "FEMALE"
            - "UNKNOWN"

The next step is updating the pom.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.47northlabs</groupId>
    <artifactId>demo-specification</artifactId>
    <version>0.0.1-SNAPSHOT</version>

    <properties>
        <swagger-annotations-version>1.5.22</swagger-annotations-version>
        <jersey-version>2.27</jersey-version>
        <jackson-version>2.8.9</jackson-version>
        <jodatime-version>2.7</jodatime-version>
        <maven-plugin-version>1.0.0</maven-plugin-version>
        <junit-version>4.8.1</junit-version>
        <springfox-version>2.9.2</springfox-version>
        <threetenbp-version>1.3.8</threetenbp-version>
        <datatype-threetenbp-version>2.6.4</datatype-threetenbp-version>
        <spring-boot-starter-test-version>2.1.1.RELEASE</spring-boot-starter-test-version>
        <spring-boot-starter-web-version>2.1.0.RELEASE</spring-boot-starter-web-version>
        <junit-version>4.12</junit-version>
        <migbase64-version>2.2</migbase64-version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>io.swagger</groupId>
            <artifactId>swagger-annotations</artifactId>
            <version>${swagger-annotations-version}</version>
        </dependency>
        <dependency>
            <groupId>org.glassfish.jersey.core</groupId>
            <artifactId>jersey-client</artifactId>
            <version>${jersey-version}</version>
        </dependency>
        <dependency>
            <groupId>org.glassfish.jersey.media</groupId>
            <artifactId>jersey-media-multipart</artifactId>
            <version>${jersey-version}</version>
        </dependency>
        <dependency>
            <groupId>org.glassfish.jersey.media</groupId>
            <artifactId>jersey-media-json-jackson</artifactId>
            <version>${jersey-version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.jaxrs</groupId>
            <artifactId>jackson-jaxrs-base</artifactId>
            <version>${jackson-version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
            <version>${jackson-version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-annotations</artifactId>
            <version>${jackson-version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>${jackson-version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.jaxrs</groupId>
            <artifactId>jackson-jaxrs-json-provider</artifactId>
            <version>${jackson-version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.datatype</groupId>
            <artifactId>jackson-datatype-joda</artifactId>
            <version>${jackson-version}</version>
        </dependency>
        <dependency>
            <groupId>joda-time</groupId>
            <artifactId>joda-time</artifactId>
            <version>${jodatime-version}</version>
        </dependency>
        <dependency>
            <groupId>com.brsanthu</groupId>
            <artifactId>migbase64</artifactId>
            <version>${migbase64-version}</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>${junit-version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <version>${spring-boot-starter-test-version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
            <version>${spring-boot-starter-web-version}</version>
        </dependency>
        <dependency>
            <groupId>io.springfox</groupId>
            <artifactId>springfox-swagger2</artifactId>
            <version>${springfox-version}</version>
        </dependency>
        <dependency>
            <groupId>io.springfox</groupId>
            <artifactId>springfox-swagger-ui</artifactId>
            <version>${springfox-version}</version>
        </dependency>
        <dependency>
            <groupId>org.threeten</groupId>
            <artifactId>threetenbp</artifactId>
            <version>${threetenbp-version}</version>
        </dependency>
        <dependency>
            <groupId>com.github.joschi.jackson</groupId>
            <artifactId>jackson-datatype-threetenbp</artifactId>
            <version>${datatype-threetenbp-version}</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.openapitools</groupId>
                <artifactId>openapi-generator-maven-plugin</artifactId>
                <version>3.3.4</version>
                <executions>
                    <execution>
                        <id>spring-boot-api</id>
                        <goals>
                            <goal>generate</goal>
                        </goals>
                        <configuration>
                            <inputSpec>${project.basedir}/src/main/resources/api.yaml</inputSpec>
                            <generatorName>spring</generatorName>
                            <configOptions>
                                <dateLibrary>joda</dateLibrary>
                            </configOptions>
                            <library>spring-boot</library>
                            <apiPackage>com.northlabs.demo.api</apiPackage>
                            <modelPackage>com.northlabs.demo.api.model</modelPackage>
                            <invokerPackage>com.northlabs.demo.api.handler</invokerPackage>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.6.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-deploy-plugin</artifactId>
                <version>2.8.1</version>
                <executions>
                    <execution>
                        <id>default-deploy</id>
                        <phase>deploy</phase>
                        <goals>
                            <goal>deploy</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

After that, we execute mvn clean install in the root directory of the project. The result is in target/generated-sources/. The generated API interface com.northlabs.demo.api.UserApi is what we need.
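
To give an idea of what the plugin produces, here is a simplified sketch of the generated interface. The real generated file also carries Swagger/SpringFox annotations and @RequestMapping metadata, so treat this as an illustration rather than the exact generated source:

package com.northlabs.demo.api;

import com.northlabs.demo.api.model.User;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;

public interface UserApi {

    // The generated default implementation answers 501 Not Implemented
    // until a controller overrides it (we will do that in Part 2).
    @GetMapping("/user/{id}")
    default ResponseEntity<User> getUserById(@PathVariable("id") Long id) {
        return new ResponseEntity<>(HttpStatus.NOT_IMPLEMENTED);
    }
}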

The magic is done by the openapi-generator-maven-plugin. There are a lot of different generators that can be used, each with many configuration options. Here is the list of them.

Part 2

Let’s create a new Spring Boot project demo-service from https://start.spring.io/.

What we need to do is add demo-specification as a Maven dependency in the demo-service project.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-parent</artifactId>
		<version>2.1.4.RELEASE</version>
		<relativePath/> <!-- lookup parent from repository -->
	</parent>
	<groupId>com.47northlabs</groupId>
	<artifactId>demo-service</artifactId>
	<version>0.0.1-SNAPSHOT</version>
	<name>demo-service</name>
	<description>Demo project for Spring Boot</description>

	<properties>
		<java.version>1.8</java.version>
	</properties>

	<dependencies>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-web</artifactId>
		</dependency>

		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-test</artifactId>
			<scope>test</scope>
		</dependency>
		<dependency>
			<groupId>com.47northlabs</groupId>
			<artifactId>demo-specification</artifactId>
			<version>0.0.1-SNAPSHOT</version>
		</dependency>
	</dependencies>

	<build>
		<plugins>
			<plugin>
				<groupId>org.springframework.boot</groupId>
				<artifactId>spring-boot-maven-plugin</artifactId>
			</plugin>
		</plugins>
	</build>

</project>

In the application.properties file we set server.port to 8000.

server.port=8000

The next step is to create a class UserRestController which implements the previously generated UserApi from demo-specification.

package com.northlabs.demoservice.rest.controller;

import com.northlabs.demo.api.UserApi;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class UserRestController implements UserApi {
}

Now, if we run the application and make a GET request to /user/1, the response status will be 501 Not Implemented.

Let’s provide a simple implementation of the API:

package com.northlabs.demoservice.rest.controller;

import com.northlabs.demo.api.UserApi;
import com.northlabs.demo.api.model.User;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class UserRestController implements UserApi {

    @Override
    public ResponseEntity<User> getUserById(@PathVariable("id") Long id) {
        User user = new User();
        user.setId(id);
        user.setFirstName("John");
        user.setLastName("Doe");
        user.setGender(User.GenderEnum.MALE);
        user.setDateOfBirth("01-01-1970");
        return ResponseEntity.ok(user);
    }
}

Now the response will be:
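
In the original post the response is shown as a screenshot. Assuming default Jackson serialization of the generated model (the exact gender value depends on how the enum is defined in api.yaml), the JSON body looks roughly like this:

{
  "id": 1,
  "firstName": "John",
  "lastName": "Doe",
  "gender": "MALE",
  "dateOfBirth": "01-01-1970"
}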

And we are done!

This is how we implement OpenAPI/Swagger in our projects.
In the next blog post, I will show you how to provide a Swagger UI, generate a Java client and a JavaScript client, modify base paths, etc.

Download the source code

Both projects are freely available on our GitLab repository. Feel free to fix any mistakes and to comment here if you have any questions or feedback.

https://gitlab.com/47northlabs/public/openapi-codegen-demo/demo-specification

https://gitlab.com/47northlabs/public/openapi-codegen-demo/demo-service

Spring I/O, The Conference in Barcelona – 2019

Reading Time: 2 minutes

Spring I/O is the leading European conference for the Spring Framework ecosystem. This year it celebrates its 8th edition and takes place in Barcelona, Spain, from 16 to 17 May, and I’m going to attend it for the first time. It is also my first conference this year, so I’m very excited 😊 about it.

Preparation

The initial preparation is done, as listed below:

  • Ticket booking, The Conference ✔️
  • Flight booking, Zürich to Barcelona ✔️
  • Hotel booking ✔️

The conference will take place at the Palau de Congressos de Catalunya, Barcelona.

Location Palau de Congressos de Catalunya on Google Maps
The entrance

Topics

The detailed agenda and topics are available here. I’m particularly interested in the topics listed below:

  • The State of Java Relational Persistence
  • Configuration Management with Kubernetes, a Spring Boot use-case
  • Moving beyond REST: GraphQL and Java & Spring
  • Spring Framework 5.2: Core Container Revisited
  • JUnit 5: what’s new and what’s coming
  • Migrating a modern spring web application to serverless
  • Relational Persistence with Spring Data JDBC [Workshop]
  • Clean Architecture with Spring
  • How to secure your Spring apps with Keycloak
  • Boot Loot – up your game and Spring like the pros
  • Spring Boot with Kotlin, Kofu and Coroutines
  • Multi-Service Reactive Streams Using Spring, Reactor, and RSocket
  • Zero Downtime Migrations with Spring Boot

Apart from the conference, I am planning to visit Font Màgica de Montjuïc, which is near the conference venue.

I’m open to further suggestions regarding my visit to Barcelona. What else should I visit? Is there any special food that I should try?

Spring Cloud Stream (event-driven microservice) with Apache Kafka… in 15 Minutes!

Reading Time: 5 minutes

Introduction

In March 2019 Shady and I visited Voxxed Days Romania in Bucharest. If you haven’t seen our post about that, check it out now! There were some really cool talks and so I decided to pick one and write about it.

At my previous employer, we switched from a monolithic service to a microservice architecture. After we implemented about 20 different microservices in 2 years, the communication between them got more complex. On top of that, all microservices were communicating synchronously! Did we build another monolith? I recently read a blog post about exactly that on another site: https://thenewstack.io/synchronous-rest-turns-microservices-back-monoliths/

I tried to recreate the complexity of synchronous communication in microservices in this picture 😅

So back to the topic… this is why I was always interested in asynchronous communication (streams, message bus, pub/sub, whatever). I heard a lot about Uber using Google Cloud’s Pub/Sub, how it’s highly scalable, asynchronous and, most importantly, just cool to use! I was inspired by Mark Heckler’s talk “Drinking from the Stream: How to Use Messaging Platforms for Scalability & Performance” and tried it out myself. Of course, I’m sharing my experiences and example with you…

Technologies

Spring Cloud Stream

Spring Cloud Stream is a framework for building highly scalable event-driven microservices connected with shared messaging systems.

https://spring.io/projects/spring-cloud-stream#overview

Spring Cloud Stream supports a variety of binder implementations, such as Apache Kafka and RabbitMQ.

We will use Spring Cloud Stream to create 3 different projects (microservices) with the Apache Kafka binder, using the Spring Initializr.

Documentation

https://cloud.spring.io/spring-cloud-static/spring-cloud-stream/2.1.2.RELEASE/single/spring-cloud-stream.html

Kafka

Apache Kafka is a distributed streaming platform. Communication between endpoints is driven by messaging middleware such as Apache Kafka or RabbitMQ.

Documentation

https://kafka.apache.org/documentation/

Let’s get started!

Prerequisites

So this is all you need to get yourself started:

  • Maven 3.2+
  • Java 7+ (Java 8 highly recommended!)
  • Docker

The idea: Money money money 💰

Let’s build a money-printing machine 🤑! So the idea is…

  • Producer
    • Prints money (coins and notes) in different currencies, values and qualities.
  • Processor
    • Fetches money and polishes coins/notes to “perfect” quality. This is quality assurance 😉.
  • Consumer
    • Fetches (spends) money and shows type, currency, value and quality.
Three microservices communicating through Kafka

Bootstrap your application with Spring Initializr

Create a new project with just a few clicks 🖱

  • Project: Maven Project
  • Language: Java
  • Spring Boot: 2.1.4
  • Project Metadata
    • Group: com.47northlabs
    • Artefact: moneyprinter-producer
  • Dependencies
    • Web
    • Cloud Stream
    • Kafka
    • Lombok
Screenshot from my setup in the Spring Initializr

Implementation of the producer

Create or edit /src/main/resources/application.properties

server.port=0

spring.cloud.stream.bindings.output.destination=processor
spring.cloud.stream.bindings.output.group=processor

spring.cloud.stream.kafka.binder.auto-add-partitions=true
spring.cloud.stream.kafka.binder.min-partition-count=4

The destination defines the pipeline (or topic) to which the message is published.

Create or edit /src/main/java/com/northlabs/lab/moneyprinterproducer/MoneyprinterProducerApplication.java

package com.northlabs.lab.moneyprinterproducer;

import lombok.AllArgsConstructor;
import lombok.Data;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Source;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import org.springframework.messaging.support.MessageBuilder;

import java.util.Random;
import java.util.UUID;

@SpringBootApplication
public class MoneyprinterProducerApplication {

	public static void main(String[] args) {
		SpringApplication.run(MoneyprinterProducerApplication.class, args);
	}

}

@EnableBinding(Source.class)
@EnableScheduling
@AllArgsConstructor
class Spammer {
	private final Source source;
	private final SubscriberGenerator generator;

	@Scheduled(fixedRate = 1000)
	private void spam() {
		Money money = generator.printMoney();
		System.out.println(money);
		source.output().send(MessageBuilder.withPayload(money).build());
	}
}

@Component
class SubscriberGenerator {
	private final String[] type = "Coin, Note".split(", ");
	private final String[] currency = "CHF, EUR, USD, JPY, GBP".split(", ");
	private final String[] value = "1, 2, 5, 10, 20, 50, 100, 200, 500, 1000".split(", ");
	private final String[] quality = "poor, fair, good, premium, flawless, perfect".split(", ");
	private final Random rnd = new Random();
	private int i = 0, j = 0, k=0, l=0;

	Money printMoney() {
		i = rnd.nextInt(2);
		j = rnd.nextInt(5);
		k = rnd.nextInt(10);
		l = rnd.nextInt(6);

		return new Money(UUID.randomUUID().toString(), type[i], currency[j], value[k], quality[l]);
	}
}

@Data
@AllArgsConstructor
class Money {
	private final String id, type, currency, value, quality;
}

Here we simply create the whole microservice in one class. The most important parts are the @EnableBinding(Source.class) annotation and the source.output().send(…) call, which publishes every freshly printed Money message. SUPER SIMPLE! Now you already have a microservice, which prints money and publishes it to the destination topic/pipeline “processor” 👏.

Implementation of the processor

https://gitlab.com/47northlabs/public/spring-cloud-stream-money/blob/master/moneyprinter-processor/src/main/resources/application.properties

https://gitlab.com/47northlabs/public/spring-cloud-stream-money/blob/master/moneyprinter-processor/src/main/java/com/northlabs/lab/moneyprinterprocessor/MoneyprinterProcessorApplication.java
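
The full processor code lives in the repository linked above. As an illustration only (class names, topic bindings and the Money DTO here are assumptions based on the producer, not the exact repository code), a minimal processor built on the Processor binding could look like this:

package com.northlabs.lab.moneyprinterprocessor;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.messaging.handler.annotation.SendTo;

@SpringBootApplication
@EnableBinding(Processor.class)
public class MoneyprinterProcessorApplication {

	public static void main(String[] args) {
		SpringApplication.run(MoneyprinterProcessorApplication.class, args);
	}

	// Listens on the input binding (assumed to be bound to the "processor" topic
	// in application.properties), polishes every coin/note to "perfect" quality
	// and forwards it to the output binding.
	@StreamListener(Processor.INPUT)
	@SendTo(Processor.OUTPUT)
	public Money polish(Money money) {
		System.out.println("Polishing " + money);
		return new Money(money.getId(), money.getType(), money.getCurrency(), money.getValue(), "perfect");
	}
}

// Same payload shape as in the producer; Jackson converts the JSON message into it.
@Data
@AllArgsConstructor
@NoArgsConstructor
class Money {
	private String id, type, currency, value, quality;
}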

Implementation of the consumer

https://gitlab.com/47northlabs/public/spring-cloud-stream-money/blob/master/moneyprinter-consumer/src/main/resources/application.properties

https://gitlab.com/47northlabs/public/spring-cloud-stream-money/blob/master/moneyprinter-consumer/src/main/java/com/northlabs/lab/moneyprinterconsumer/MoneyprinterConsumerApplication.java
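
Again, the complete consumer is in the repository. A minimal sketch under the same assumptions as the processor above, this time using the Sink binding:

package com.northlabs.lab.moneyprinterconsumer;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;

@SpringBootApplication
@EnableBinding(Sink.class)
public class MoneyprinterConsumerApplication {

	public static void main(String[] args) {
		SpringApplication.run(MoneyprinterConsumerApplication.class, args);
	}

	// "Spends" the money: logs type, currency, value and quality of every received message.
	@StreamListener(Sink.INPUT)
	public void spend(Money money) {
		System.out.println("Spending " + money.getValue() + " " + money.getCurrency()
				+ " (" + money.getType() + ", quality: " + money.getQuality() + ")");
	}
}

// Same payload shape as in the producer and processor sketches.
@Data
@AllArgsConstructor
@NoArgsConstructor
class Money {
	private String id, type, currency, value, quality;
}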

Docker for Kafka and Zookeeper

Run these commands to create a network and run Kafka and Zookeeper in docker containers.

docker network create kafka

docker run -d --net=kafka --name=zookeeper -e ZOOKEEPER_CLIENT_PORT=2181 confluentinc/cp-zookeeper:5.0.0
docker run -d --net=kafka --name=kafka -p 9092:9092 -e KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181 -e KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092 -e KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR=1 confluentinc/cp-kafka:5.0.0

If you can’t connect, add this line to /etc/hosts to ensure proper routing to the container network “kafka”:

127.0.0.1 kafka

Start messaging platforms with the docker start command:

docker start zookeeper
docker start kafka

It’s a wrap!

Congratulations! You made it. Now just run your producer, processor and consumer and it should look something like this:

My example

Getting started

  1. Run docker/runKafka.sh
  2. Run docker/startMessagingPlatforms.sh
  3. Start producer, processor and consumer microservice (e.g. inside IntelliJ)
  4. Enjoy the log output 👨‍💻📋

Download the source code

The whole project is freely available on our GitLab repository. Feel free to fix any mistakes and to comment here if you have any questions or feedback.

https://gitlab.com/47northlabs/public/spring-cloud-stream-money

Live from JPoint, Moscow 2019

Reading Time: 3 minutes

The conference took place at the World Trade Center in Moscow and started at 9 am. From the beginning it looked like it would be huge: well organized, with big conference halls. The first step was attendee registration.

After completing the registration and picking up our welcome packages, we had an opening coffee break with drinks. We also visited most of the big companies’ stands in front of the conference halls, where you can find interesting free materials such as stickers, manuals and packages from the company you are visiting.

The next step was the conference itself. There were four conference halls, each with different speakers. The opening talk was given by Anton Keks from Codeborne on the topic The world needs full-stack craftsmen.

After the opening talk, the conference continued with different speakers on every track. Some of them spoke in Russian, so we focused on the English talks. Every talk was one to one and a half hours long, followed by a coffee break in the lounge room. Two lunch breaks were also included, and at the end of the day there was a party at 20:00. You can check the full schedule here.

Day two had exactly the same setup, with some different speakers or the same ones covering different topics. In general, the organization of the conference was amazing, as it should be for a world-class event. I highly recommend visiting it if you have the chance.

Stay tuned for the next part, where I will share my opinion of the talks I attended…