CQRS and Event Sourcing with Axon Framework

Reading Time: 5 minutes

What is CQRS?

CQRS stands for Command Query Responsibility Segregation. It is a pattern where the allowed actions in the application are divided into two groups: commands and queries. A command changes the state of an object but does not return any data, while a query returns data and does not change any state. This design style comes from the need for different strategies when scaling the reading and the writing parts of our application. By dividing methods into these two categories, you can use a different model to update information than the model you use to read information. By separate models we most commonly mean different object models, probably running in different logical processes, perhaps on separate hardware. In a web example, a user looks at a web page that is rendered using the query model. If they initiate a change, that change is routed to the separate command model for processing, and the resulting change is communicated to the query model to render the updated state.
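
To make the split concrete, here is a minimal sketch in plain Java (the class and interface names are illustrative and not tied to any framework): the write side accepts commands and returns nothing, while the read side answers queries with read-only view objects.

// Write side: a command expresses intent and returns no data
public class RenameProductCommand {
    public final String productId;
    public final String newName;

    public RenameProductCommand(String productId, String newName) {
        this.productId = productId;
        this.newName = newName;
    }
}

public interface ProductCommandHandler {
    void handle(RenameProductCommand command); // changes state, returns nothing
}

// Read side: a query returns data and changes no state
public class ProductView {
    public String productId;
    public String name;
}

public interface ProductQueryService {
    ProductView findByProductId(String productId); // read-only view model
}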

Event Sourcing

Event Sourcing is a specialized pattern for data storage. Instead of storing the current state of an entity, every change of state is stored as a separate event that makes sense to a business user. The current state is calculated by applying all events that changed the state of the entity. In terms of CQRS, the stored events are the results of executing a command against an aggregate on the command (write) side. The event store also transmits the events that it saves, so the read side can process them and build the data sets it needs for queries.
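
As a minimal illustration in plain Java (no framework involved, and the event and class names are invented for the example), the current state of an account can be rebuilt by replaying its stored events:

import java.util.List;

// Events describe what happened, in business terms
interface AccountEvent {}
class MoneyDeposited implements AccountEvent { final long amount; MoneyDeposited(long amount) { this.amount = amount; } }
class MoneyWithdrawn implements AccountEvent { final long amount; MoneyWithdrawn(long amount) { this.amount = amount; } }

class Account {

    private long balance;

    // The current state is derived by applying every stored event in order
    static Account replay(List<AccountEvent> history) {
        Account account = new Account();
        for (AccountEvent event : history) {
            account.apply(event);
        }
        return account;
    }

    private void apply(AccountEvent event) {
        if (event instanceof MoneyDeposited) {
            balance += ((MoneyDeposited) event).amount;
        } else if (event instanceof MoneyWithdrawn) {
            balance -= ((MoneyWithdrawn) event).amount;
        }
    }

    long balance() {
        return balance;
    }
}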

AXON Framework

Axon is a CQRS framework for Java. It is an open-source framework that provides the building blocks that CQRS requires and helps to create scalable and extensible applications while maintaining application consistency in distributed systems. It provides basic building blocks for writing aggregates, commands, queries, events, sagas, command handlers, event handlers, query handlers, repositories, communication buses and so on. Axon Framework comes with a Spring Boot Starter, making using it as easy as adding a dependency. Axon will automatically configure itself based on best practices and other dependencies you have set. Providing explicit configuration is a matter of adding a bean of the component that needs to be configured differently. Furthermore, Axon Framework integrates with Spring Cloud Discovery, Spring Messaging and Spring Cloud AMQP.
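
For reference, pulling Axon into a Spring Boot project typically boils down to a single dependency; the coordinates below are the ones published for the Axon Spring Boot Starter, and the version property is a placeholder you would set to the release you use:

<dependency>
    <groupId>org.axonframework</groupId>
    <artifactId>axon-spring-boot-starter</artifactId>
    <!-- set axon.version to the Axon release you are using -->
    <version>${axon.version}</version>
</dependency>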

AXON components

  • Command – a combination of expressed intent (which describes what you want to be done) and the information required to act on that intent. The command model is used to process the incoming command, to validate it and to define the outcome. Commands are typically represented by simple and straightforward objects that contain all the data necessary for a command handler to execute them
  • Command Bus – receives commands and routes them to the Command Handlers
  • Command Handler – the component responsible for handling commands of a certain type and taking action based on the information contained inside them. Each command handler responds to a specific type of command and executes logic based on the contents of the command
  • Event Bus – dispatches events to all interested event listeners. This can be done either synchronously or asynchronously. Asynchronous event dispatching allows the command execution to return and hand over control to the user while the events are being dispatched and processed in the background. Not having to wait for event processing to complete makes an application more responsive. Synchronous event processing, on the other hand, is simpler and is a sensible default. By default, synchronous processing handles event listeners in the same transaction that also processed the command
  • Event Handler – the component that acts on incoming events. Event handlers typically execute logic based on decisions that have been made by the command model. Usually, this involves updating view models or forwarding updates to other components, such as third-party integrations
  • Query Handler – the component that acts on incoming query messages. Query handlers typically read data from the view models created by the event handlers
  • Query Bus – receives queries and routes them to the Query Handlers. A query handler is registered at the query bus with both the type of query it handles and the type of response it provides. Both the query and the result type are typically simple, read-only DTO objects (a code sketch showing how these building blocks fit together follows after this list)
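
To show how these building blocks come together, here is a minimal sketch of an event-sourced aggregate using Axon's annotations; the command, event and field names are invented for the example:

import org.axonframework.commandhandling.CommandHandler;
import org.axonframework.eventsourcing.EventSourcingHandler;
import org.axonframework.modelling.command.AggregateIdentifier;
import org.axonframework.modelling.command.AggregateLifecycle;
import org.axonframework.modelling.command.TargetAggregateIdentifier;
import org.axonframework.spring.stereotype.Aggregate;

// Command: expressed intent plus the data needed to act on it
class IssueCardCommand {
    @TargetAggregateIdentifier
    final String cardId;
    final int initialValue;

    IssueCardCommand(String cardId, int initialValue) {
        this.cardId = cardId;
        this.initialValue = initialValue;
    }
}

// Event: the business fact that resulted from handling the command
class CardIssuedEvent {
    final String cardId;
    final int initialValue;

    CardIssuedEvent(String cardId, int initialValue) {
        this.cardId = cardId;
        this.initialValue = initialValue;
    }
}

@Aggregate
public class GiftCard {

    @AggregateIdentifier
    private String cardId;
    private int remainingValue;

    protected GiftCard() {
        // no-arg constructor required by Axon for event-sourced reconstruction
    }

    @CommandHandler
    public GiftCard(IssueCardCommand command) {
        // validate the command here, then publish the resulting event
        AggregateLifecycle.apply(new CardIssuedEvent(command.cardId, command.initialValue));
    }

    @EventSourcingHandler
    public void on(CardIssuedEvent event) {
        // state changes happen only when events are applied,
        // which is also how the aggregate is rebuilt from its history
        this.cardId = event.cardId;
        this.remainingValue = event.initialValue;
    }
}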

Will each application benefit from Axon?

Unfortunately not. Simple CRUD (Create, Read, Update, Delete) applications which are not expected to scale will probably not benefit from CQRS or Axon. Fortunately, there is a wide variety of applications that do benefit from Axon.
Axon platform is being used by a wide range of companies in highly demanding sectors such as healthcare, banking, insurance, logistics and public sector. Axon Framework is free and the source code is available in a Git repository: https://github.com/AxonFramework.

JMeter

Reading Time: 4 minutes

Today, we are going to take a look at JMeter. You can embed it in your application as a library and build an external integration-testing solution with it. You don't have to use it for load testing: it could simply send one request, check the return status code, check the return value and move on. There is an argument that JMeter may be overkill for that, but it provides an easy way to verify the response, allows you to set everything up using the JMeter desktop app, and then you can move on to testing latency under load.

First, we need to create a test file that will be put later in our spring boot application. The steps for creating the .jmx file are as follows:

1 – Open the JMeter window by clicking on /home/apache-jmeter-5.1.1/bin/jmeter.bat. The next thing you want to do with every JMeter Test Plan is to add a Thread Group element. Set the “Loop Count” parameter to 1, as shown below:

2 – The next step is to create a While Controller. The purpose of the While Controller is to repeat a given set of actions until the condition is broken. While is a basic programming concept used to run actions when the number of iterations is unknown or varies.

3 – Create an HTTP request as shown in the figure below:

4 – Afterwards, we are going to create a CSV Data Set Config. This step points to the CSV file from which the partner users are collected and substituted into the HTTP request.

5 – After running our test, we want to see the results, e.g. which calls have been made and which ones have failed. This is done through Listeners, a recording mechanism that shows results, including logging and debugging.

The View Results Tree is the most common Listener.

Right-click – Add->Listener->View Results Tree

6 – At the end, it should be something like the figure below:

Now click ‘Save’. Your test will be saved as a .jmx file. To run the test, click the green arrow on top. After the test completes running, you can view the results on the Listener as in the figure below. In this example, you can see the tests were successful because they’re green. On the right, you can see more detailed results, like load time, connect time, errors, the request data, the response data, etc. You can also save the results if you want to.

JMeter can also be used for Maven testing through a plugin and works quite nicely with variables, prerequisites, etc. Integrating performance testing into your projects has many benefits:

  • It provides a constant check of the performances of the application
  • Secures continuous delivery in production
  • Allows early detection of performance problems or performance regressions
  • Automating the process means less manual work, allowing your team to focus on more valuable tasks like performance analysis and optimisation

First of all, you need to add the plugin to your project. Go to the Maven project directory (jmeter-testproject in this case) and edit the pom.xml file. Here you must add the plugin. You can find the basic configuration here. You just need to copy the configuration text and paste it into your pom.xml file.

Finally, the jmeter-maven-plugin section of your pom.xml file looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<plugin>
   <groupId>com.lazerycode.jmeter</groupId>
   <artifactId>jmeter-maven-plugin</artifactId>
   <version>2.9.0</version>
   <executions>
      <execution>
         <id>jmeter-tests</id>
         <phase>test</phase>
         <goals>
            <goal>jmeter</goal>
         </goals>
      </execution>
      <execution>
         <id>jmeter-check-results</id>
         <phase>test</phase>
         <goals>
            <goal>results</goal>
         </goals>
      </execution>
   </executions>
   <configuration>
      <testFilesDirectory>src/test/resources/jmeter</testFilesDirectory>
      <ignoreResultFailures>false</ignoreResultFailures>
   </configuration>
</plugin>

In the code above we are running two goals:

  • jmeter in the test phase: this goal runs the load test and generates the HTML report
  • results in the test phase: this goal runs verification on the error rate and fails the build when errors are found
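
Since both goals are bound to the test phase and testFilesDirectory points to src/test/resources/jmeter, the .jmx files stored there run automatically as part of the regular Maven build (for example with mvn clean verify), and failing samples break the build because ignoreResultFailures is set to false.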

Unit testing with Mockito

Reading Time: 4 minutes

A unit is the smallest testable part of an application. Mockito is a well-known mocking framework that allows us to create and configure mock objects. With Mockito we can mock both interfaces and classes used by the class under test. Mockito also helps us reduce the amount of boilerplate code through its annotations.

Adding Mockito to the project

  • using gradle
testCompile "org.mockito:mockito−core:2.7.7"
  • using maven
<dependency>
      <groupId>org.mockito</groupId>
      <artifactId>mockito-core</artifactId>
      <version>2.7.7</version>
      <scope>test</scope>
</dependency>

Mockito annotations

  • @Mock – used for mock creation.
  • @Spy – creates a spy object.
  • @InjectMocks – instantiates the tested object and injects all the annotated field dependencies into it
  • @Captor – used to capture argument values for further assertions

Mockito example @Mock

Let’s say we have the following classes and we want to write a test for the CalculationService:

public class CalculationService {

   private AddService addService;
  
   public int calculate(int x, int y) {
       return addService.add(x, y);
   }
}

public class AddService {

   public int add(int x, int y) {
       return x+y;
   }
}

The usage of the @Mock and @InjectMocks annotations is shown in the following sample code:

@InjectMocks
private CalculationService calculationService;

@Mock
private AddService addService;

@Before
public void setUp() {
   // initializes objects annotated with @Mock, @Spy, @Captor, or @InjectMocks
   MockitoAnnotations.initMocks(this);
}

@Test
public void testCalculationService() {
    // mock the result from method add in addService
    doReturn(20).when(addService).add(10, 10);

    // verify that the calculate method from calculationService will return the same value
    assertEquals(20, calculationService.calculate(10, 10));
}

@Spy

A Mockito spy is used to spy on a real object. The main difference between a spy and a mock is that with a spy the tested instance behaves like a normal instance. The following example explains it:

@Test
public void testSpyInstance() {
    List<String> spyList = spy(new ArrayList());
    spyList.add("firstElement");
    spyList.add("secondElement");
    verify(spyList).add("firstElement");
    verify(spyList).add("secondElement");

    assertEquals(2, spyList.size());
}

Note that method add is called and the size of the spy list is 2.

@Captor

The Mockito framework gives us plenty of useful annotations. One of the most recent ones I’ve had a chance to use is @Captor. An ArgumentCaptor is used to capture arguments passed to a method that is either void or returns a different type of object.
Let’s say we have the following method snippet:

public class AnyClass {
    public void doSearch(SearchData searchData) {
        CustomData data = new CustomData("custom data");
        searchData.doSomething(data);
    }
}

We want to capture the argument data so we can verify its inner data. So, to check that, we can use ArgumentCaptor from Mockito:

// Create a mock of the SearchData
SearchData data = mock(SearchData.class);

// Run the doSearch method with the mock
new AnyClass().doSearch(data);

// Capture the argument of the doSomething function
ArgumentCaptor<CustomData> captor = ArgumentCaptor.forClass(CustomData.class);
verify(data, times(1)).doSomething(captor.capture());

// Assert the argument
CustomData actualData = captor.getValue();
assertEquals("custom data", actualData.customData);

New features in Mockito 2.x

Since its inception, Mockito has lacked the ability to mock final classes and methods. One of the major features of the 2.x version is support for stubbing final methods and final classes. This feature has to be explicitly activated by creating a file named org.mockito.plugins.MockMaker in the src/test/resources/mockito-extensions directory, containing a single line:
mock-maker-inline

public final class MyFinalClass {

    public String hello() {
        return "my final class says hello";
    }
}

public class MyCallingClass {

    final MyFinalClass myFinalClass;

    public MyCallingClass(MyFinalClass myFinalClass) {
        this.myFinalClass = myFinalClass;
    }

    public String executeFinal() {
        return myFinalClass.hello();
    }
}

public class MyCallingClassTest {

    @Test
    public void testFinalClass() {
        // mocking a final class only works with the mock-maker-inline extension enabled
        MyFinalClass myFinalClass = mock(MyFinalClass.class);
        MyCallingClass myCallingClass = new MyCallingClass(myFinalClass);

        when(myFinalClass.hello()).thenReturn("testString");

        assertEquals("testString", myCallingClass.executeFinal());
    }
}

Given the example above, without the file org.mockito.plugins.MockMaker and its content, we get the following error:

When the file is in the resources and the content is valid, we are all good.

The plan for the future is to have a programmatic way of using this feature.

Conclusion

In this article, I gave a brief overview of some of the features in Mockito test framework. Like any other tool, it must be used in a proper way to be useful. Now go and bring your unit tests to the next level.

Testing Spring Boot application with examples

Reading Time: 7 minutes

Why bother writing tests is already a well-discussed topic in software engineering. I won’t go into much detail on this topic, but I will mention some of the main benefits.

In my opinion, testing your software is the only way to achieve confidence that your code will work in the production environment. Another huge benefit is that it allows you to refactor your code without fear of breaking some existing features.

Risk of bugs vs the number of tests

In the Java world, one of the most popular frameworks is Spring Boot, and part of the popularity and success of Spring Boot is exactly the topic of this blog – testing. Spring Boot and the Spring framework offer out-of-the-box support for testing, and new features are being added constantly. When the Spring framework appeared on the Java scene in 2005, one of the reasons for its success was exactly this: ease of writing and maintaining tests, as opposed to JavaEE, where writing integration tests requires additional libraries like Arquillian.

In the following, I will go over different types of tests in Spring Boot, when to use them and give a short example.

Testing pyramid

We can roughly group all automated tests into 3 groups:

  • Unit tests
  • Service (integration) tests
  • UI (end to end) tests

As we go from the bottom of the pyramid to the top, tests become slower to execute: unit tests take on the order of a few milliseconds, service tests hundreds of milliseconds, and UI tests run in seconds. If we look at scope, unit tests, as the name suggests, test small units of code; service tests exercise a whole service or a slice of it involving multiple units; and UI tests have the largest scope, covering multiple services. In the following sections, I will go over examples of how we can unit test and service test a Spring Boot application. UI testing can be achieved using external tools like Selenium and Protractor, but they are not related to Spring Boot.

Unit testing

In my opinion unit tests make the most sense when you have some kind of validators, algorithms or other code that has lots of different inputs and outputs and executing integration tests would take too much time. Let’s see how we can test validator with Spring Boot.

Validator class for emails

public class Validators {

    private static final String EMAIL_REGEX = "(?:[a-z0-9!#$%&'*+/=?^_`{|}~-]+(?:\\.[a-z0-9!#$%&'*+/=?^_`{|}~-]+)*|\"(?:[\\x01-\\x08\\x0b\\x0c\\x0e-\\x1f\\x21\\x23-\\x5b\\x5d-\\x7f]|\\\\[\\x01-\\x09\\x0b\\x0c\\x0e-\\x7f])*\")@(?:(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\\.)+[a-z0-9](?:[a-z0-9-]*[a-z0-9])?|\\[(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?|[a-z0-9-]*[a-z0-9]:(?:[\\x01-\\x08\\x0b\\x0c\\x0e-\\x1f\\x21-\\x5a\\x53-\\x7f]|\\\\[\\x01-\\x09\\x0b\\x0c\\x0e-\\x7f])+)\\])";

    public static boolean isEmailValid(String email) {
        return email.matches(EMAIL_REGEX);
    }
}

Unit tests for email validator with Spring Boot

@RunWith(MockitoJUnitRunner.class)
public class ValidatorsTest {
    @Test
    public void testEmailValidator() {
        assertThat(isEmailValid("valid@north-47.com")).isTrue();
        assertThat(isEmailValid("invalidnorth-47.com")).isFalse();
        assertThat(isEmailValid("invalid@47")).isFalse();
    }
}

MockitoJUnitRunner is used to enable Mockito in tests and to detect @Mock annotations. In this case, we are testing the email validator as a separate unit from the rest of the application. MockitoJUnitRunner is not specific to Spring Boot, so this way of writing unit tests can be used with other frameworks as well.

Integration testing of the whole application

If we have to choose only one type of test in Spring Boot, then using an integration test to test the whole application makes the most sense. We will not be able to cover all the scenarios, but we will significantly reduce the risk. In order to do integration testing, we need to start the application context. In Spring Boot 2, this is achieved with the following annotations: @RunWith(SpringRunner.class) and @SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT). This starts the application on a random port, and we can inject beans into our tests and make REST calls against the application endpoints.

Below is example code for testing the book endpoints. For making REST API calls we use Spring's TestRestTemplate, which is more suitable for integration tests than RestTemplate.

@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
public class SpringBootTestingApplicationTests {

    @Autowired
    private TestRestTemplate restTemplate;

    @Autowired
    private BookRepository bookRepository;

    private Book defaultBook;

    @Before
    public void setup() {
        defaultBook = new Book(null, "Asimov", "Foundation", 350);
    }

    @Test
    public void testShouldReturnCreatedWhenValidBook() {
        ResponseEntity<Book> bookResponseEntity = this.restTemplate.postForEntity("/books", defaultBook, Book.class);

        assertThat(bookResponseEntity.getStatusCode()).isEqualTo(HttpStatus.CREATED);
        assertThat(bookResponseEntity.getBody().getId()).isNotNull();
        assertThat(bookRepository.findById(1L)).isPresent();
    }

    @Test
    public void testShouldFindBooksWhenExists() throws Exception {
        Book savedBook = bookRepository.save(defaultBook);

        ResponseEntity<Book> bookResponseEntity = this.restTemplate.getForEntity("/books/" + savedBook.getId(), Book.class);

        assertThat(bookResponseEntity.getStatusCode()).isEqualTo(HttpStatus.OK);
        assertThat(bookResponseEntity.getBody().getId()).isEqualTo(savedBook.getId());
    }

    @Test
    public void testShouldReturn404WhenBookMissing() throws Exception {
        Long nonExistingId = 999L;
        ResponseEntity<Book> bookResponseEntity = this.restTemplate.getForEntity("/books/" + nonExistingId, Book.class);

        assertThat(bookResponseEntity.getStatusCode()).isEqualTo(HttpStatus.NOT_FOUND);
    }
}

Integration testing of web layer (controllers)

Spring Boot offers the ability to test layers in isolation, starting only the beans that are required for the test. From Spring Boot 1.4 onward there is a very convenient annotation, @WebMvcTest, that starts only the components required for a typical web-layer test, like controllers, Jackson converters and similar, without starting the full application context, avoiding the startup of components unnecessary for this test such as the database layer. When we use this annotation, we make the REST calls with the MockMvc class.

The following is an example of testing the same endpoints as in the example above, but in this case we only test whether the web layer works as expected, and we mock the database layer using the @MockBean annotation, which is also available starting from Spring Boot 1.4. With these annotations, only BookController is loaded into the application context and the database layer is mocked.

@RunWith(SpringRunner.class)
@WebMvcTest(BookController.class)
public class BookControllerTest {
    @Autowired
    private MockMvc mockMvc;

    @MockBean
    private BookRepository repository;

    @Autowired
    private ObjectMapper objectMapper;

    private static final Book DEFAULT_BOOK = new Book(null, "Asimov", "Foundation", 350);

    @Test
    public void testShouldReturnCreatedWhenValidBook() throws Exception {
        when(repository.save(Mockito.any())).thenReturn(DEFAULT_BOOK);

        this.mockMvc.perform(post("/books")
                .content(objectMapper.writeValueAsString(DEFAULT_BOOK))
                .contentType(MediaType.APPLICATION_JSON)
                .accept(MediaType.APPLICATION_JSON))
                .andExpect(status().isCreated())
                .andExpect(MockMvcResultMatchers.jsonPath("$.name").value(DEFAULT_BOOK.getName()));
    }

    @Test
    public void testShouldFindBooksWhenExists() throws Exception {
        Long id = 1L;
        when(repository.findById(id)).thenReturn(Optional.of(DEFAULT_BOOK));

        this.mockMvc.perform(get("/books/" + id)
                .accept(MediaType.APPLICATION_JSON))
                .andExpect(status().isOk())
                .andExpect(MockMvcResultMatchers.content().string(Matchers.is(objectMapper.writeValueAsString(DEFAULT_BOOK))));
    }

    @Test
    public void testShouldReturn404WhenBookMissing() throws Exception {
        Long id = 1L;
        when(repository.findById(id)).thenReturn(Optional.empty());

        this.mockMvc.perform(get("/books/" + id)
                .accept(MediaType.APPLICATION_JSON))
                .andExpect(status().isNotFound());
    }
}

Integration testing of database layer (repositories)

Similarly to the way we tested the web layer, we can test the database layer in isolation, without starting the web layer. This kind of testing in Spring Boot is achieved with the @DataJpaTest annotation. This annotation applies only the auto-configuration related to the JPA layer and by default uses an in-memory database, because it is fastest to start up and is good enough for most integration tests. We also get access to TestEntityManager, which is an EntityManager with supporting features for JPA integration tests.

Below is an example of testing the database layer of the above application. With these tests we only check whether the database layer works as expected: we do not make any REST calls, and we verify the results from BookRepository using the provided TestEntityManager.

@RunWith(SpringRunner.class)
@DataJpaTest
public class BookRepositoryTest {
    @Autowired
    private TestEntityManager entityManager;

    @Autowired
    private BookRepository repository;

    private Book defaultBook;

    @Before
    public void setup() {
        defaultBook = new Book(null, "Asimov", "Foundation", 350);
    }

    @Test
    public void testShouldPersistBooks() {
        Book savedBook = repository.save(defaultBook);

        assertThat(savedBook.getId()).isNotNull();
        assertThat(entityManager.find(Book.class, savedBook.getId())).isNotNull();
    }

    @Test
    public void testShouldFindByIdWhenBookExists() {
        Book savedBook = entityManager.persistAndFlush(defaultBook);

        assertThat(repository.findById(savedBook.getId())).isEqualTo(Optional.of(savedBook));
    }

    @Test
    public void testFindByIdShouldReturnEmptyWhenBookNotFound() {
        long nonExistingID = 47L;
        
        assertThat(repository.findById(nonExistingID)).isEqualTo(Optional.empty());
    }
}

Conclusion

You can find a working example with all of these tests on the following repo: https://gitlab.com/47northlabs/public/spring-boot-testing

In the following table, I’m showing the execution times, including startup, of the different types of tests that I’ve used as examples. We can clearly see that unit tests, as mentioned in the beginning, are the fastest ones and that separating integration tests into layered tests leads to faster execution times.

Type of test – execution time with startup:
  • Unit test – 80 ms
  • Integration test – 620 ms
  • Web layer test – 190 ms
  • Database layer test – 220 ms

Editable Templates in AEM 6.5

Reading Time: 8 minutes

Editable templates were introduced in AEM 6.2 and have been constantly improving with each new version since. They allow authors to create and edit templates. Template authors can create and configure templates without the help of the development team. To be able to create and edit templates, the authors must be members of the template-authors group.

Here are some of the benefits of using editable templates:

  • editable templates provide the flexibility to define content policies that persist design properties. There is no need for design mode, which requires extra permissions for the author to set design properties, along with a replication of the design page after any design change
  • they maintain a dynamic connection between pages and the template, which gives template authors the power to change the template structure along with locked content, and those changes are reflected across all pages based on the editable template
  • they don’t require any extra training for authors to create a new page based on an editable template; authors create a new page just as they do with static templates
  • they can be created and edited by your authors
  • after a new page is created, a dynamic connection is maintained between the page and the template. This means that changes to the template structure are reflected on any pages created with that template (changes to the initial content will not be reflected)
  • uses content policies (edited from the template author) to persist the design properties (does not use Design mode within the page editor)
  • are stored under /conf

Here are the tasks that the template author can do with the help of the AEM’s template editor:

  • create a new template or copy an existing template
  • manage the life cycle of the template
  • add components to the template and position them on a responsive grid
  • pre-configure the components using policies
  • define which components can be edited on pages created with the template

Create editable templates from the configuration browser

Go to Tools -> General -> Configuration Browser

It will create the basic hierarchy of templates in /conf directory.

There are three parts of template editor:

  • templates: all dynamic (editable) templates created by authors
  • policies: there are two types of policies
    template level policy: used for adding policies to the template
    component level policy: used for adding policies to the component level
  • template-types: base templates for the creation of new templates in runtime

There are three parts of a template:

  • initial: the initial content of a page created based on the template. This content can be edited or removed by the author
  • policies: here a particular template is linked to a policy by using the cq:policy property
  • structure:
    – the structure allows you to define the structure of the template
    – the components defined at the template level can’t be removed from the resulting page
    – if you want template authors to be able to add and remove components, add a parsys to the template
    – components can be locked and unlocked to allow you to define initial content

How to create base template-type

To start working on editable templates, we need to create a page component. Navigate to /apps/47northlabs-sample-project/components/page and click on create component.

The next step is to create a template type, which helps authors create their editable templates. (Please note that the configuration is available on GitLab.)

How can authors create editable templates

Next step would be to create the editable template. That can be done from Tools -> General -> Templates -> 47northlabs-sample-project -> Choose empty template.

Add template title, description and open the template. There should be a responsive grid available.

Configure policies

By defining policies, authors can configure which components regular authors may place on the page. Since there is no policy defined on the template yet, no component is assigned to it. Clicking on the first (Policy) icon takes the author to a new screen where they can define which components can be put on this template.

Create new policy.

Once done, the components will be available to drag & drop onto the page.

Enable template

Once template authors are done creating templates, they must enable them to make them available in the Sites section, where regular authors can select them to create pages.

Create editable templates from code

Another possibility is to create editable templates from code. We can create a sample project based on the Adobe Archetype using the following command:

mvn archetype:generate -DarchetypeGroupId=com.adobe.granite.archetypes -DarchetypeArtifactId=aem-project-archetype -DarchetypeVersion=19

The generated sample project already contains content-page editable template by default. We are going to create three additional editable templates i.e.

  • home-page
  • landing-page
  • language-page

We are going to add some components as default in the templates, and few pages in ui.content project. The general idea is to have some test content and to play around with some corner cases. Let’s start.

The demo content looks like the image below.

We have four editable templates available. With the cq:allowedTemplates property we control which templates can be used to create child pages under a given page. For example, for the content-page we want to make only the landing-page available, so the initial content will look like:

<jcr:content
    ...
    cq:allowedTemplates="[conf/aem-editable-templates-demo/settings/wcm/templates/landing-page]">
    ...
 </jcr:content>

Similarly, the initial content of the landing-page has the corresponding configuration:

<jcr:content
    ...
    cq:allowedTemplates="[conf/aem-editable-templates-demo/settings/wcm/templates/content-page]">
    ...
</jcr:content>

What happens in case we want to add a fancy new template which should be available only for a particular page types? These two solutions come to my mind:

  • write a groovy script that will update existing cq:allowedTemplates property of all the pages created for a given template with the new template
  • update the structure of the given template, so all the existing pages created with that template will be updated

With the second approach, for example, if we want to make that fancy page template available on the content page template, we have to add the following configuration to the structure element of the content-page template:

<jcr:content
    ...
    cq:allowedTemplates="[conf/aem-editable-templates-demo/settings/wcm/templates/fancy-page]">
    ...
</jcr:content>

Given this, it is important to point out the differences between initial content and structure elements of the template definition.

Initial Content

  • is merged with the structure during page creation
  • changes to the initial content will not be reflected in all pages created with the given template
  • the jcr:content node is copied to any new page
  • the root node holds a list of components that defines what will be available in the resulting page

Structure

  • is merged with the initial content during page creation
  • changes to the structure will be reflected in all pages created with the given template
  • the cq:responsive node is responsible for the responsive layout
  • the root node defines the available components in the resulting page

Conclusion

With editable templates, you give template authors the flexibility to create and modify templates as they want. They act as a central place to manage everything about the template (structure, initial content, layout) and the components used in it. If you are migrating to editable templates, make sure you assess the requirements not only for the template but also for the components.

The code from this blog post is available on GitLab.

My opinion on talks from JPoint Moscow 2019

Reading Time: 4 minutes

If you have read my previous parts, this is the last one, in which I will give my highlights on the talks that I attended.

The first stop was the opening talk from Anton Keks on the topic The world needs full-stack craftsmen. An interesting presentation about current problems in software development, like splitting development roles, and what the real result of that is. Another topic was agile methodology and whether it really helps development teams build a better product. There were also some words about startup companies and their usual problems. In general, an excellent presentation.

Simon Ritter had, in my opinion, the best talks at JPoint. On the first day he presented JDK 12: Pitfalls for the unwary. In this session, he covered the impact of migrating applications from previous versions of Java to the latest one, from aspects like Java language syntax, class libraries and JVM options. Another interesting part was how to choose which versions of Java to use in production. A well-balanced presentation with real problems and solutions.

The next stop was Kohsuke Kawaguchi, creator of Jenkins, with the topic Pushing a big project forward: the Jenkins story. It was a story told from a management perspective, about new projects that are coming up and what the demands of the business are. To be honest, it was a little bit boring for me, because I was expecting superpowers coming to Jenkins, but he steered the topic towards this management story.

Sebastian Daschner from IBM presented Bulletproof Java Enterprise applications. This session covered which non-functional requirements we need to be aware of to build stable and resilient applications, with interesting examples of different resiliency approaches, such as circuit breakers, bulkheads and backpressure, in action. At the end, he showed how to add telemetry to an application and enhance a microservice with monitoring, tracing or logging in a minimalistic way.

Simon Ritter again, this time with the topic Local variable type inference. His talk was about using var and letting the compiler infer the type of the variable. There were a lot of examples of when it makes sense to use it, but also of when you should not. In my opinion, a very useful presentation.

Rafael Winterhalter talked about Java agents; to be more specific, he covered the Byte Buddy library and how to program Java agents with no knowledge of Java bytecode. He also showed how Java classes can be used as templates for implementing highly performant code changes, avoiding solutions like AspectJ or Javassist and still performing better than agents implemented in low-level libraries.

To summarize, the conference was excellent, any Java developer would be happy to be here, so put JPoint on your roadmap for sure. Stay tuned for my next conference, thanks for reading, THE END 🙂

Java 8 Streams

Reading Time: 5 minutes

The Java Stream API was added in Java 8 in order to provide a functional approach to processing collections of objects. Do not confuse it with I/O streams; Java 8 streams are a completely different thing.

Java Stream does not store data and is not a data structure. Also the underlying data source is not modified.

Java Stream uses functional interfaces and supports functional-style operations on streams of elements using lambda expressions.

Stream operations

Stream operations are divided into intermediate and terminal operations.

Intermediate operations are stream operations that return a new Stream. These operations are used for producing new stream elements and for sending the stream to the next operation. They are lazy, i.e. they are not executed until the result of the processing is needed (a short example of this follows after the list below).

List of all Stream intermediate operations:

  • filter()
  • map()
  • flatMap()
  • distinct()
  • sorted()
  • peek()
  • limit()
  • skip()
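
To see the laziness mentioned above in action, here is a short sketch (the names list is just sample data): the filter runs only once a terminal operation asks for a result, and findFirst short-circuits, so not every element is processed.

List<String> names = Arrays.asList("Peter", "Dave", "Mike", "David", "Pete");

// Nothing is printed yet: the intermediate operations are only recorded
Stream<String> pipeline = names.stream()
    .filter(name -> {
        System.out.println("filter: " + name);
        return name.startsWith("D");
    })
    .map(String::toUpperCase);

// The terminal operation triggers the processing; findFirst short-circuits,
// so only "Peter" and "Dave" ever pass through the filter
System.out.println(pipeline.findFirst().orElse("none"));
Output:
filter: Peter
filter: Dave
DAVE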

Terminal operations are stream operations that do not return a new Stream. Once a terminal method is called on a stream, it consumes the stream, and after that the stream cannot be used anymore. Terminal operations process all the elements in the stream before they return the result.

List of all Stream terminal operations:

  • toArray()
  • collect()
  • count()
  • reduce()
  • forEach()
  • forEachOrdered()
  • min()
  • max()
  • anyMatch()
  • allMatch()
  • noneMatch()
  • findAny()
  • findFirst()

Some code examples

forEach

The simplest and most common operation, it loops over the elements of the stream, calling the supplied code on each element.

Map<Integer, List<User>> usersByAge = users.stream().collect(Collectors.groupingBy(User::getAge));  
usersByAge.forEach((age, u) -> System.out.format("age %s: %s\n", age, u)); 
// age 18: [John]   
// age 23: [Alex, David]   
// age 12: [Peter]

map

Produces new stream after applying a function to each element of the existing stream. It can produce a new stream of different type.

Integer[] userIds = { 5, 2, 7 };
List<User> users = Stream.of(userIds)
    .map(userRepository::findById)
    .collect(Collectors.toList());
assertEquals(users.size(), userIds.length);

collect

Repacks the elements of the existing stream into a new data structure.

List<User> users = usersList.stream().collect(Collectors.toList());
assertEquals(usersList, users);

filter

Produces new stream that contains elements from the existing stream that are fulfilling some criteria from the predicate.

users.stream().filter(user -> user.getAge() > 20)
    .collect(Collectors.toList());

findFirst

Returns an Optional for the first entry in the stream.

users.stream().filter(user -> user.getAge() > 20).findFirst().orElse(null);

findAny()

Returns an Optional from any entry in the stream. It will most likely return the first entry in the stream in a non-parallel execution, but there is no guarantee for that.

users.stream().filter(user -> user.getAge() > 20).findAny().orElse(null);

toArray

Returns array from the stream.

User[] users = usersList.stream().toArray(User[]::new); 
assertThat(usersList.toArray(), equalTo(users));

flatMap

Used to flatten the data structure to simplify further operations on the stream. Useful for complex data structures.

List<String> names1 = Arrays.asList("Peter", "David");
List<String> names2 = Arrays.asList("Dave", "Robert");
List<String> names3 = Arrays.asList("Tom", "Tim");
List<List<String>> names = Arrays.asList(names1, names2, names3);
List<String> flatNames = names.stream().flatMap(Collection::stream).collect(Collectors.toList());

peek

Performs the specified operation on each element of the stream and returns a new stream that can be used for further processing.

User[] arrayOfUsers = {
    new User(1, "James", 100000.0),
    new User(2, "Martin", 200000.0),
    new User(3, "David", 300000.0) };
List<User> userList = Arrays.asList(arrayOfUsers);
userList.stream()
    .peek(e -> e.salaryIncrement(10.0))
    .peek(System.out::println)
    .collect(Collectors.toList());
assertThat(userList, contains(
    hasProperty("salary", equalTo(110000.0)),
    hasProperty("salary", equalTo(220000.0)),
    hasProperty("salary", equalTo(330000.0))
));

sorted()

Sorted is an intermediate operation which returns a sorted view of the stream. The elements are sorted in natural order unless you pass a custom Comparator (an example with a custom Comparator follows after the snippet below).

List<String> names = Arrays.asList("Peter", "Dave", "Mike", "David", "Pete"); 
names.stream().sorted().map(String::toUpperCase).forEach(System.out::println);  
Output:
DAVE
DAVID
MIKE
PETE
PETER

Keep in mind that sorted only creates a sorted view of the stream without changing the ordering of the backing collection. The ordering of names is untouched.
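
For completeness, passing a custom Comparator, as mentioned above, looks like this; here the same names are sorted by length instead of natural order:

names.stream()
    .sorted(Comparator.comparing(String::length))
    .forEach(System.out::println);
Output:
Dave
Mike
Pete
Peter
David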

count()

Count is a terminal operation returning the number of elements in the stream as a long.

long total = names.stream().filter((s) -> s.startsWith("D")).count(); 
System.out.println(total);  
Output: 2

reduce()

This terminal operation performs a reduction on the elements of the stream with the given function. The result is an Optional holding the reduced value.

List<Car> cars  = Arrays.asList(new Car("Kia", 19500), 
new Car("Hyundai", 20500),  
new Car("Ford", 25000),  
new Car("Dacia", 20000)); 

Optional<Car> car = cars.stream().reduce((c1, c2) 
-> c1.getPrice() > c2.getPrice() ? c1 : c2);  
car.ifPresent(System.out::println);  
Output: Ford 25000 

anyMatch()

The method accepts a Predicate as an argument. It returns true as soon as an element satisfying the predicate is found, and it will not process any more elements.

List<String> names = Arrays.asList("Peter", "Dave", "Mike", "David", "Pete"); 
boolean matched = names.stream().anyMatch(name -> name.startsWith("D"));  
System.out.println(matched);  
Output: true

allMatch()

The method accepts a Predicate as an argument. The Predicate is applied to each entry in the stream, and if every element satisfies the passed Predicate, it returns true, otherwise false.

List<String> names = Arrays.asList("Peter", "Dave", "Mike", "David", "Pete"); 
boolean matched = names.stream().allMatch(name -> name.startsWith("D")); 
System.out.println(matched);   
Output: false

noneMatch()

The method accepts a Predicate as an argument. The Predicate is applied to each entry in the stream, and if no element satisfies the passed Predicate, it returns true, otherwise false.

List<String> names = Arrays.asList("Peter", "Dave", "Mike", "David", "Pete"); 
boolean matched = names.stream().noneMatch(name -> name.startsWith("S"));
System.out.println(matched); 
Output: true

Conclusion

In this article we showed some of the details of the new Stream API in Java 8. We saw various operations and how lambdas are used to reduce a huge amount of boilerplate code.


Unit testing using JSONassert library

Reading Time: 3 minutes

In this article, we’ll have a closer look at a library called JSONAssert. We will explain, using some examples, how this library can be used. So, let’s get started!

Working with an easy example:

Let’s start our tests with a simple JSON string comparison:

String actual = "{objectId:123, name:\"magy\",lastName:\"henry\"}"; String expected="{objectId:123,name:\"magy\"}"; JSONAssert.assertEquals(expected, actual, false);

The above example passes because strict is set to false. However, if it is set to true, the test will fail, since the actual JSON contains a field (lastName) that the expected JSON does not. Keep in mind that JSONAssert makes a logical comparison of the data: the ordering of elements does not matter when dealing with JSON objects.

Working with a complex example

Assume that you want a unit test where you have to validate (match an actual response against an expected one) our REST interfaces in JUnit. Our endpoint delivers a list of objects, so the goal is to verify the properties of each object. Let’s assume that the delivered response is a list of type Partner, where the implementation of the Partner class is as follows:

@Data
public class Partner {
    private String id;
    private String firstName;
    private String lastName;
    private LocalDate birthDate;
    private Gender gender;
    private MaritalStatus maritalStatus;
    private String phoneNumber;
}

The following Java example shows how this looks with plain assertions. Assume we need to write some assertions for each family member in the list; the following would have to be done:

ResponseEntity<List<Partner>> response = restTemplate.exchange(
  "http://localhost:8080/partners/x/family",
  HttpMethod.GET,
  null,
  new ParameterizedTypeReference<List<Partner>>(){});
List<Partner> partners = response.getBody();
// Now we need to test that each partner in the family has certain values.
Partner partner1 = partners.stream().filter(partner -> partner.getId().equals("1")).findFirst().get();
assertEquals("Magy", partner1.getFirstName());
assertEquals("Mueller", partner1.getLastName());
assertEquals(of(1980, 7, 12), partner1.getBirthDate());
assertEquals(FEMALE, partner1.getGender());
assertNull(partner1.getMaritalStatus());

// Testing the second person
Partner partner2 = partners.stream().filter(partner -> partner.getId().equals("2")).findFirst().get();
assertEquals("Marc", partner2.getFirstName());
assertEquals("Ullenstein", partner2.getLastName());
assertEquals(of(1988, 7, 13), partner2.getBirthDate());
assertEquals(MALE, partner2.getGender());
assertNull(partner2.getMaritalStatus());

Testing the values for just one partner already takes some code, so testing multiple partners would take a lot of lines. We would like to use JSONAssert instead. To make that easier, let’s create a JSON file under test/resources/json in which we create a list of objects, where each object contains the properties that will be tested. Keep in mind that this library offers a non-strict mode, which means that you don’t have to test against every property.
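
As an illustration, and purely as an example matching the assertions shown earlier (the file name, the exact date format and the property names depend on your model and serialization settings), expectedJsonResponse.json could contain only the properties we care about:

[
  {
    "id": "1",
    "firstName": "Magy",
    "lastName": "Mueller",
    "birthDate": "1980-07-12",
    "gender": "FEMALE"
  },
  {
    "id": "2",
    "firstName": "Marc",
    "lastName": "Ullenstein",
    "birthDate": "1988-07-13",
    "gender": "MALE"
  }
]

With strict mode set to false, extra properties in the actual response (such as maritalStatus or phoneNumber) are simply ignored.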

String actual= restTemplate.getForObject("http://localhost:8080/partners/x/family", String.class);
String expected = IOUtils.toString(this.getClass().getResourceAsStream("/json/expectedJsonResponse.json"),"UTF-8");
JSONAssert.assertEquals(expected, actual, false);

The method takes three parameters, as seen in the above example: the first parameter is the expected JSON, the second one is the result, or what we got as a response from our endpoint, and the third one defines whether we should use strict mode or not. When comparing the code written in both cases, it becomes clear why you would use this library in the future: you can drop all of the assertions used in the earlier example.

Conclusion

In this article, we looked at multiple scenarios in which JSONAssert can be helpful. We started with a very simple example and moved on to more complex comparisons. And, as always, stay tuned for new interesting articles.

Generate Spring Boot REST API using Swagger/OpenAPI

Reading Time: 5 minutes

Writing API definition is pretty cool stuff. It helps consumers to understand the API and agree on its attributes. In our company for that purpose we are using OpenAPI Specification (formerly Swagger Specification).

But the real deal is generating code and documentation from the specification file. In this blog I will show you how we are doing that in 47 North Labs.

We will split this blog in two parts. The first part will be generating code, and the second part will be using the generated code.

Part 1

We are creating empty maven project named “demo-specification”.

Next thing is creating API definition file, api.yaml in src/main/resources/ directory. The demo content of this file is:

openapi: "3.0.0"
info:
  description: "Codegen for demo service"
  version: "0.0.1"
  title: "Demo Service Specification"
  contact:
    email: "antonie.zafirov@north-47.com"
tags:
  - name: "user"
    description: "User tag for demo purposes"
servers:
  - url: http://localhost:8000/
    description: "local host"
paths:
  /user/{id}:
    get:
      tags:
        - "user"
      summary: "Retrieves User by ID"
      operationId: "getUserById"
      parameters:
        - name: "id"
          in: "path"
          description: "retrieves user by user id"
          required: true
          schema:
            type: "integer"
            format: "int64"
      responses:
        200:
          description: "Retrieves family members by person id"
          content:
            application/json:
              schema:
                type: "object"
                $ref: '#/components/schemas/User'
components:
  schemas:
    User:
      type: "object"
      required:
        - "id"
        - "firstName"
        - "lastName"
        - "dateOfBirth"
        - "gender"
      properties:
        id:
          type: "integer"
          format: "int64"
        firstName:
          type: "string"
          example: "John"
        lastName:
          type: "string"
          example: "Smith"
        dateOfBirth:
          type: "string"
          example: "1992-10-05"
        gender:
          type: "string"
          enum:
            - "MALE"
            - "FEMALE"
            - "UNKNOWN"

Next step is updating pom.xml file

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.47northlabs</groupId>
    <artifactId>demo-specification</artifactId>
    <version>0.0.1-SNAPSHOT</version>

    <properties>
        <swagger-annotations-version>1.5.22</swagger-annotations-version>
        <jersey-version>2.27</jersey-version>
        <jackson-version>2.8.9</jackson-version>
        <jodatime-version>2.7</jodatime-version>
        <maven-plugin-version>1.0.0</maven-plugin-version>
        <junit-version>4.8.1</junit-version>
        <springfox-version>2.9.2</springfox-version>
        <threetenbp-version>1.3.8</threetenbp-version>
        <datatype-threetenbp-version>2.6.4</datatype-threetenbp-version>
        <spring-boot-starter-test-version>2.1.1.RELEASE</spring-boot-starter-test-version>
        <spring-boot-starter-web-version>2.1.0.RELEASE</spring-boot-starter-web-version>
        <junit-version>4.12</junit-version>
        <migbase64-version>2.2</migbase64-version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>io.swagger</groupId>
            <artifactId>swagger-annotations</artifactId>
            <version>${swagger-annotations-version}</version>
        </dependency>
        <dependency>
            <groupId>org.glassfish.jersey.core</groupId>
            <artifactId>jersey-client</artifactId>
            <version>${jersey-version}</version>
        </dependency>
        <dependency>
            <groupId>org.glassfish.jersey.media</groupId>
            <artifactId>jersey-media-multipart</artifactId>
            <version>${jersey-version}</version>
        </dependency>
        <dependency>
            <groupId>org.glassfish.jersey.media</groupId>
            <artifactId>jersey-media-json-jackson</artifactId>
            <version>${jersey-version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.jaxrs</groupId>
            <artifactId>jackson-jaxrs-base</artifactId>
            <version>${jackson-version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
            <version>${jackson-version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-annotations</artifactId>
            <version>${jackson-version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>${jackson-version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.jaxrs</groupId>
            <artifactId>jackson-jaxrs-json-provider</artifactId>
            <version>${jackson-version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.datatype</groupId>
            <artifactId>jackson-datatype-joda</artifactId>
            <version>${jackson-version}</version>
        </dependency>
        <dependency>
            <groupId>joda-time</groupId>
            <artifactId>joda-time</artifactId>
            <version>${jodatime-version}</version>
        </dependency>
        <dependency>
            <groupId>com.brsanthu</groupId>
            <artifactId>migbase64</artifactId>
            <version>${migbase64-version}</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>${junit-version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <version>${spring-boot-starter-test-version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
            <version>${spring-boot-starter-web-version}</version>
        </dependency>
        <dependency>
            <groupId>io.springfox</groupId>
            <artifactId>springfox-swagger2</artifactId>
            <version>${springfox-version}</version>
        </dependency>
        <dependency>
            <groupId>io.springfox</groupId>
            <artifactId>springfox-swagger-ui</artifactId>
            <version>${springfox-version}</version>
        </dependency>
        <dependency>
            <groupId>org.threeten</groupId>
            <artifactId>threetenbp</artifactId>
            <version>${threetenbp-version}</version>
        </dependency>
        <dependency>
            <groupId>com.github.joschi.jackson</groupId>
            <artifactId>jackson-datatype-threetenbp</artifactId>
            <version>${datatype-threetenbp-version}</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.openapitools</groupId>
                <artifactId>openapi-generator-maven-plugin</artifactId>
                <version>3.3.4</version>
                <executions>
                    <execution>
                        <id>spring-boot-api</id>
                        <goals>
                            <goal>generate</goal>
                        </goals>
                        <configuration>
                            <inputSpec>${project.basedir}/src/main/resources/api.yaml</inputSpec>
                            <generatorName>spring</generatorName>
                            <configOptions>
                                <dateLibrary>joda</dateLibrary>
                            </configOptions>
                            <library>spring-boot</library>
                            <apiPackage>com.northlabs.demo.api</apiPackage>
                            <modelPackage>com.northlabs.demo.api.model</modelPackage>
                            <invokerPackage>com.northlabs.demo.api.handler</invokerPackage>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.6.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-deploy-plugin</artifactId>
                <version>2.8.1</version>
                <executions>
                    <execution>
                        <id>default-deploy</id>
                        <phase>deploy</phase>
                        <goals>
                            <goal>deploy</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

After that, we execute mvn clean install in the root directory of the project. The result is in target/generated-sources/. The generated API interface com.northlabs.demo.api.UserApi is what we need.

The magic is done by openapi-generator-maven-plugin. There are a lot of different generators that can be used, with a lot of options. Here is the list of them.
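
For orientation, the generated UserApi is an ordinary Spring MVC interface. Stripped of its Swagger annotations, it looks roughly like the sketch below (simplified; the exact output depends on the generator version and options). The generated default implementation answers with 501 Not Implemented until it is overridden, which explains the behaviour we will see in Part 2.

import com.northlabs.demo.api.model.User;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;

public interface UserApi {

    @RequestMapping(value = "/user/{id}", method = RequestMethod.GET, produces = "application/json")
    default ResponseEntity<User> getUserById(@PathVariable("id") Long id) {
        // overridden by the implementing controller; returns 501 until then
        return new ResponseEntity<>(HttpStatus.NOT_IMPLEMENTED);
    }
}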

Part 2

Let’s create a new Spring Boot project, demo-service, from https://start.spring.io/.

What we need to do is to add demo-specification as a maven dependency in the demo-service project.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-parent</artifactId>
		<version>2.1.4.RELEASE</version>
		<relativePath/> <!-- lookup parent from repository -->
	</parent>
	<groupId>com.47northlabs</groupId>
	<artifactId>demo-service</artifactId>
	<version>0.0.1-SNAPSHOT</version>
	<name>demo-service</name>
	<description>Demo project for Spring Boot</description>

	<properties>
		<java.version>1.8</java.version>
	</properties>

	<dependencies>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-web</artifactId>
		</dependency>

		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-test</artifactId>
			<scope>test</scope>
		</dependency>
		<dependency>
			<groupId>com.47northlabs</groupId>
			<artifactId>demo-specification</artifactId>
			<version>0.0.1-SNAPSHOT</version>
		</dependency>
	</dependencies>

	<build>
		<plugins>
			<plugin>
				<groupId>org.springframework.boot</groupId>
				<artifactId>spring-boot-maven-plugin</artifactId>
			</plugin>
		</plugins>
	</build>

</project>

In application.properties file we are setting server.port to 8000.

server.port=8000

Next step is creating a class UserRestController which will implement previously generated UserApi from demo-specification.

package com.northlabs.demoservice.rest.controller;

import com.northlabs.demo.api.UserApi;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class UserRestController implements UserApi {
}

Now, if we run the application and try to make GET request to /user/1 the response status will be 501 Not Implemented.

Let’s make some simple implementation of the API.

package com.northlabs.demoservice.rest.controller;

import com.northlabs.demo.api.UserApi;
import com.northlabs.demo.api.model.User;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class UserRestController implements UserApi {

    @Override
    public ResponseEntity<User> getUserById(@PathVariable("id") Long id) {
        User user = new User();
        user.setId(id);
        user.setFirstName("John");
        user.setLastName("Doe");
        user.setGender(User.GenderEnum.MALE);
        user.setDateOfBirth("01-01-1970");
        return ResponseEntity.ok(user);
    }
}

Now the response will be:

And we are done.

This is how we implement OpenAPI/Swagger in our projects.
In the next blog I will show you how you can provide Swagger UI, generate a Java client and a JavaScript client, modify base paths, etc.

Download the source code

Both projects are freely available on our Gitlab repository. Feel free to fix any mistakes and to comment here if you have any questions or feedback.

https://gitlab.com/47northlabs/public/openapi-codegen-demo/demo-specification

https://gitlab.com/47northlabs/public/openapi-codegen-demo/demo-service

Spring I/O, The Conference in Barcelona – 2019

Reading Time: 2 minutes

Spring I/O is the leading European conference for the Spring Framework ecosystem. This year it is the 8th edition, taking place in Barcelona, Spain, on 16-17 May, and I’m going to attend it for the first time. This conference is also my first conference this year, so I’m very excited 😊 about it.

Preparation

Initial preparation is done as mentioned below:

  • Ticket booking, The Conference ✔️
  • Flight booking, Zürich to Barcelona ✔️
  • Hotel booking ✔️

The Conference will take place in Palau de Congressos de Catalunya, Barcelona.

Location Palau de Congressos de Catalunya on Google Maps
The entrance

Topics

Detailed agenda and topics will be available here. But I’m interested in below-mentioned topics:

  • The State of Java Relational Persistence
  • Configuration Management with Kubernetes, a Spring Boot use-case
  • Moving beyond REST: GraphQL and Java & Spring
  • Spring Framework 5.2: Core Container Revisited
  • JUnit 5: what’s new and what’s coming
  • Migrating a modern spring web application to serverless
  • Relational Persistence with Spring Data JDBC [Workshop]
  • Clean Architecture with Spring
  • How to secure your Spring apps with Keycloak
  • Boot Loot – up your game and Spring like the pros
  • Spring Boot with Kotlin, Kofu and Coroutines
  • Multi-Service Reactive Streams Using Spring, Reactor, and RSocket
  • Zero Downtime Migrations with Spring Boot

Apart from the conference, I am planning to visit Font Màgica de Montjuïc, which is near to the conference venue.

I’m open to further suggestions regarding my visit to Barcelona. What else should I visit? Is there any special food that I should try?