Microservices are not new—they have been around for many years. For example, Stubby, a general-purpose infrastructure based on Remote Procedure Call (RPC), was used in Google data centers in the early 2000s to connect a number of services within and across data centers. What is new is their recent surge in popularity and visibility. Before microservices became popular, monolithic architectures were mainly used for developing on-premises and cloud-based applications.
In a monolithic architecture, you develop different components such as presentation, application logic, business logic, and Data Access Objects (DAOs), and then either bundle them together in an Enterprise Archive (EAR) or a Web Archive (WAR), or store them in a single directory hierarchy (as in Rails or Node.js).
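To make the layering concrete, here is a minimal sketch of what those components look like inside a single deployable unit. The class and method names (`ProductDao`, `ProductService`, `describeCatalog`) are hypothetical, chosen only for illustration: the point is that the presentation, business logic, and data access layers are compiled and shipped together as one artifact.

```java
import java.util.List;

// Hypothetical monolith: all layers live in one codebase and one deployable
// (e.g., a single WAR/EAR), communicating via in-process method calls.
public class MonolithSketch {

    // Data access layer: a DAO returning data (in-memory stand-in for a database)
    static class ProductDao {
        List<String> findAll() {
            return List.of("book", "pen");
        }
    }

    // Business logic layer: applies a rule on top of the DAO's data
    static class ProductService {
        private final ProductDao dao = new ProductDao();

        String describeCatalog() {
            return dao.findAll().size() + " products";
        }
    }

    // Presentation layer: formats the result for the client
    public static void main(String[] args) {
        ProductService service = new ProductService();
        System.out.println("Catalog: " + service.describeCatalog());
    }
}
```

In a microservices architecture, by contrast, each of these layers (or, more typically, each business capability) would be deployed and scaled as its own independently running service, with the in-process calls replaced by network calls.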
Many well-known applications, such as Netflix, have been developed using a microservices architecture, and companies such as eBay, Amazon, and Groupon have evolved from monolithic architectures to microservices architectures.
Now that you have some insight into the background and history of microservices, let's discuss the limitations of the traditional approach, namely monolithic application development, and see how microservices address them.