How to achieve safe operation of online services
I am Volodymyr Melnyk, Technical Director at Tucha. I want to share an approach to keeping online services secure, in particular protecting them from external threats. I mean not only protection against unauthorized changes to web application code, but other aspects of security as well. So let us look at ways to counter attempts to exploit application vulnerabilities, which an intruder can use to gain unauthorized access to data and to steal, alter, or destroy it.
But first I will turn on Captain Obvious mode and focus on some well-known postulates.
Errors in web application code often allow an attacker to force the application to perform actions that benefit the intruder and harm the provider or other users of the online service that the application powers.
As of today, there is no reliable way to guarantee that software code contains no errors, and no guaranteed way to ensure that existing errors cannot be found and exploited.
By definition, the Internet cannot be considered a secure environment. Automated vulnerability scanners constantly sweep every reachable address, probing the services that respond for weak spots in their protection. This practice is effective enough, otherwise it would not be used. Popular services also attract plenty of enthusiasts who focus on finding vulnerabilities whose exploitation could bring them interesting results.
Meanwhile, the number of online services on the Internet keeps growing, and the rate of growth is accelerating. One of the vital questions is how to protect data stored on servers that are permanently connected to the Internet.
There are quite a lot of practices aimed at reducing the risk of being hacked, but in order not to boil the ocean, let us focus on two or three of them.
- Identifying vulnerabilities before an application is published, using automated testing after every code change (a small example follows this list).
- Protecting against the exploitation of vulnerabilities with a web application firewall (WAF), which detects potentially dangerous patterns in user behavior and blocks suspicious requests.
- The advantages of microservice architecture bear repeating: it minimizes the risk of privilege escalation when an intruder compromises a microservice that does not itself hold broad access.
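To illustrate the first practice, here is a minimal sketch of an automated check that could run after every commit. The base URL, the `/search` endpoint, and the payloads are assumptions made for the example; the point is simply that the pipeline probes the application with hostile input and fails the build if anything unexpected happens.

```python
# Minimal sketch of an automated vulnerability check, assuming the application
# is reachable at BASE_URL in the test environment and exposes a /search
# endpoint (both are hypothetical placeholders for this example).
import requests

BASE_URL = "http://app.test.local"  # test-environment address (assumption)

# A few classic injection payloads; a real pipeline would use a proper scanner.
SUSPICIOUS_PAYLOADS = [
    "' OR '1'='1",                # SQL injection probe
    "<script>alert(1)</script>",  # reflected XSS probe
    "../../etc/passwd",           # path traversal probe
]

def test_injection_payloads_are_handled_safely():
    for payload in SUSPICIOUS_PAYLOADS:
        response = requests.get(f"{BASE_URL}/search", params={"q": payload}, timeout=5)
        # The application must neither crash nor echo the payload back verbatim.
        assert response.status_code in (200, 400, 403)
        assert payload not in response.text
```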
Are they effective? Yes.
Are they used everywhere? No.
What can prevent you from applying these practices? The answer is laziness and optimism.
"Our web applications are already protected quite well," is probably the most harmless assumption that could be made. "We write code without errors," perhaps, even more dangerous, and absurd.
Of course, as I mentioned at the beginning, there are no tools that can completely prevent errors during development, and there are no completely reliable methods of protection against their exploitation. However, the more attention you pay to these aspects, the lower the probability that a web application will be compromised.
I believe that a platform with these functions built in from the start could help. Let us imagine what it might look like.
Such a platform would automate the processes related to the application lifecycle, keep the applications running, and store their data in the computing cloud. To be sufficiently reliable, it would provide distributed computing and distributed data storage.
Security and stability are important to us, so using this platform means placing copies of the application in isolated containers that run simultaneously on multiple physical nodes and receive equal shares of traffic from a load balancer that dispatches client requests. The containers are managed by the Kubernetes orchestration system, which is becoming more and more popular among Ukrainian DevOps engineers and which global trends already allow us to call a new industry standard.
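As a rough illustration of how identical replicas share traffic, here is a toy round-robin dispatcher. A real load balancer is far more sophisticated (health checks, weighting, session handling), and the backend addresses below are made up for the example; only the principle of spreading requests evenly is the point.

```python
# Toy round-robin dispatcher illustrating how identical application replicas
# receive roughly equal shares of incoming requests. Backend addresses are
# hypothetical; a production balancer also tracks health, weights, sessions, etc.
from itertools import cycle

REPLICAS = [
    "10.0.0.11:8080",  # container on node 1 (example address)
    "10.0.0.12:8080",  # container on node 2 (example address)
    "10.0.0.13:8080",  # container on node 3 (example address)
]

next_replica = cycle(REPLICAS)

def dispatch(request_id: int) -> str:
    """Pick the next replica in turn for an incoming request."""
    target = next(next_replica)
    print(f"request {request_id} -> {target}")
    return target

for i in range(6):
    dispatch(i)  # each replica ends up handling every third request
```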
Speaking of DevOps, it is also worth mentioning that the platform automatically performs the operations related to application testing, compilation, and containerization. You push the application code to a Git repository; after that, each time you commit changes, the compilation, testing, and packaging processes start automatically. The platform then builds a container image and places it in a private registry. The new version of the application is published first in a test environment so that QA engineers can check it. When the decision to release is made, it is carefully delivered to the production environment: the orchestration system first launches the new version, makes sure it is working, and only then switches client traffic to it.
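That last step, making sure the new version works before traffic is switched, typically relies on the application exposing a health or readiness endpoint that the orchestrator can probe. Below is a minimal sketch of such an endpoint using only the Python standard library; the `/healthz` path and the readiness logic are assumptions for the sake of the example.

```python
# Minimal readiness-endpoint sketch: the orchestrator probes /healthz and only
# routes client traffic to the container once it answers 200 OK.
# The path and the readiness check itself are assumptions for this example.
from http.server import BaseHTTPRequestHandler, HTTPServer

def application_is_ready() -> bool:
    # In a real service this would verify database connections, caches,
    # configuration, and so on; here it simply reports "ready".
    return True

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/healthz" and application_is_ready():
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            self.send_response(503)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), HealthHandler).serve_forever()
```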
Using the platform means using containers: completely isolated environments in which, alongside the application, only the components strictly necessary for its operation are available. When the platform needs to launch the application, it creates new containers (not one, but several identical ones), deploying them from the image stored in the registry. Whatever happens inside a container, those changes never make it back into the image. Containers are constantly created and destroyed, so an intruder cannot break in and plant a backdoor in the code. Why? First, all the program files are strictly write-protected. Second, even if changes were somehow made, those files are only a copy of the application, living in a temporary, ephemeral container that will sooner or later be destroyed.
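A simple way to picture the first point is a sanity check that the application's own files really are read-only at runtime. A test like the sketch below (the `/app` path is a hypothetical mount point) could run inside the container as part of the pipeline.

```python
# Sanity-check sketch: confirm the application directory is not writable from
# inside the running container. The path /app is a hypothetical mount point.
import os

APP_DIR = "/app"  # where the application code is mounted (assumption)

def test_application_files_are_read_only():
    probe = os.path.join(APP_DIR, ".write_probe")
    try:
        with open(probe, "w") as f:
            f.write("should not succeed")
    except OSError:
        return  # expected: the filesystem is mounted read-only
    os.remove(probe)  # clean up if the write unexpectedly succeeded
    raise AssertionError("application directory is writable; expected read-only")
```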
At the same time, the platform analyzes every request received from the Internet for potential threats. Before a request reaches the load balancer, it passes through the web application firewall, which determines whether it should be allowed. The firewall checks the request against a constantly updated signature database, and a customer can disable certain checks or add their own.
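To make the idea concrete, here is a toy signature check of the kind a WAF performs conceptually. Real firewalls use far richer rule sets and anomaly scoring; the patterns below are illustrative only.

```python
# Toy signature-based request filter illustrating what a web application
# firewall does conceptually. The patterns are illustrative only; real WAF
# rule sets are much larger and are updated continuously.
import re

SIGNATURES = [
    re.compile(r"(?i)\bunion\b.+\bselect\b"),  # crude SQL injection pattern
    re.compile(r"(?i)<script[^>]*>"),          # crude XSS pattern
    re.compile(r"\.\./"),                      # path traversal attempt
]

# A customer could append their own rules here.
CUSTOM_SIGNATURES: list[re.Pattern] = []

def is_request_allowed(path: str, query: str, body: str) -> bool:
    """Return False if any signature matches any part of the request."""
    payload = " ".join((path, query, body))
    for signature in SIGNATURES + CUSTOM_SIGNATURES:
        if signature.search(payload):
            return False  # block the request before it reaches the balancer
    return True

print(is_request_allowed("/search", "q=' UNION SELECT password FROM users--", ""))  # False
print(is_request_allowed("/search", "q=cloud+hosting", ""))                         # True
```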
This year, we launched the TuchaKube platform, which provides a range of such features. Implementing them on your own usually requires long and scrupulous work, an understanding of many fundamentally new concepts, and a painful search for an answer to the question, "Why do I need this?" And these features are not only about security. The platform also provides things like:
- monitoring a huge number of metrics;
- automatic horizontal scaling, by creating the required number of identical containers on different computing nodes based on the current load (a sketch of the idea follows this list);
- automatic generation of certificates for TLS connections;
- automation of DevOps functions.
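The scaling decision itself boils down to simple arithmetic, similar in spirit to the formula the Kubernetes Horizontal Pod Autoscaler uses: compare the observed load with the target load per replica and round up. The numbers below are illustrative only.

```python
# Illustrative replica-count calculation: scale the current replica count by
# the ratio of observed load to the target load per replica, rounding up,
# and clamp the result to configured bounds.
import math

def desired_replicas(current_replicas: int,
                     current_metric: float,
                     target_metric: float,
                     min_replicas: int = 1,
                     max_replicas: int = 10) -> int:
    raw = current_replicas * (current_metric / target_metric)
    return max(min_replicas, min(max_replicas, math.ceil(raw)))

# Example: 3 replicas at 90% average CPU with a 60% target -> scale out to 5.
print(desired_replicas(current_replicas=3, current_metric=90, target_metric=60))
```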
Since last year, prompted by the wishes of some of our partners, we have faced the need to automate CI/CD processes. This led us to heavy use of Docker containers and then to the Kubernetes orchestration system. Having accumulated enough practical experience, we decided to systematize it and turn it into an additional benefit for everyone. This is how the revolutionary TuchaKube platform appeared.
If you have any questions or tasks for us, feel free to contact us. You are welcome 24×7!