About Vladam
We created the Vladam software solution with the aim of delivering a seamless trading experience to crypto-traders, with a desirable return on investment while maintaining predictable volatility. Our solution operated in production, serving hundreds of users, for 12 months. As we have not obtained a license for providing services related to virtual currencies, we have decided to list the Vladam software solution for sale in the hope that your company can continue the good work we started.
We believe that AI can identify market patterns and trends beyond human comprehension by combining off-chain, on-chain, and trading data.
The team behind VLADAM consists of professionals in Artificial Intelligence and FinTech and, most importantly, experienced traders.
With Vladam you get a fully usable, fully developed, production-tested solution, ready to change your market, with no hidden costs.
You can see what the Vladam software solution looks like in the following screenshots and by trying out a demo of our platform for free.
More about Vladam software stack
The Vladam Frontend Web Application has been implemented in Angular 12 with multilingual support, server-side rendering, SEO optimization, blog support, and a searchable multimedia FAQ section covering all user-facing aspects of the software. The code follows state-of-the-art principles for building web applications with the MVC model. It is a responsive web application with support for tablet and mobile screens.
All APIs, workflows, and user stories are well documented. This also includes Figma wireframes.
The Vladam Backend Application has been implemented in Python 3.7 with a REST-API-first approach. It is based on the FastAPI framework, leveraging the SQLAlchemy ORM for database access and Alembic for database migrations. Vladam's database of choice is PostgreSQL. The backend supports two types of users, individual users and organizations, giving you the option to serve both individual and business clients on the platform.
The Backend Application uses JWT authentication and also includes support for two-factor authentication using authenticator apps.
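To illustrate the JWT flow described above, here is a minimal, hand-rolled sketch of HS256 token issuance and verification using only the standard library. The secret key and claim names are illustrative; the production backend presumably uses an established library such as PyJWT or python-jose rather than this hand-rolled version.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"change-me"  # hypothetical signing key; load from config in practice


def _b64(data: bytes) -> str:
    """URL-safe base64 without padding, as used in JWTs."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def issue_token(user_id, ttl=3600):
    """Build a signed HS256 JWT with a subject and expiry claim."""
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64(json.dumps({"sub": user_id, "exp": int(time.time()) + ttl}).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = _b64(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"


def verify_token(token):
    """Return the claims if the signature is valid and unexpired, else None."""
    try:
        header, payload, sig = token.split(".")
    except ValueError:
        return None
    signing_input = f"{header}.{payload}".encode()
    expected = _b64(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None
    claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    if claims["exp"] < time.time():
        return None
    return claims
```

The constant-time `hmac.compare_digest` comparison avoids leaking signature information through timing differences.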
In addition to user management, the Backend Application includes APIs for CRUD operations on API keys, billing profiles, user profiles, two-factor authentication, active trading strategies, historic trading strategies, supported trading strategies, followed trading strategies, best-performing strategies, risk management, etc.
Finally, the Backend Application includes a REST API implementation for integration with the Bitmex, Binance, ByBit, FTX, and CoinbasePro exchanges, allowing users to add their API keys for a specific third-party exchange, check the validity of those keys, set up trading strategies, etc.
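The source does not show the integration code itself, but as an illustration of what key validation involves: Binance's signed REST endpoints require an HMAC-SHA256 signature over the request's query string, and calling a signed endpoint such as `/api/v3/account` is one common way to check that a stored key and secret are valid. A sketch of the signing step (the endpoint URL is real Binance API; everything else is illustrative):

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode


def sign_binance_request(api_secret, params):
    """Append the HMAC-SHA256 signature Binance expects to a query string."""
    query = urlencode(params)
    signature = hmac.new(api_secret.encode(), query.encode(), hashlib.sha256).hexdigest()
    return f"{query}&signature={signature}"


def account_request_url(api_secret):
    """Build a signed URL for GET /api/v3/account; an authentication error
    in the response would indicate an invalid key or missing permissions."""
    params = {"timestamp": int(time.time() * 1000)}
    return "https://api.binance.com/api/v3/account?" + sign_binance_request(api_secret, params)
```

Each exchange uses its own signing scheme, so in practice the engine would need one such adapter per supported exchange.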
This software component is in charge of submitting orders on behalf of end users in line with generated trading signals, verifying the status of orders, and canceling orders. The Order Submission Engine has been implemented in Python 3, leveraging multiprocessing, multi-threading, queues, etc. It also includes a REST API implementation for integration with the Bitmex, Binance, ByBit, FTX, and CoinbasePro exchanges. It is designed to be deployed across multiple servers (closest to a specific exchange's servers to reduce response time), partitioned by exchange or even by digital asset. It comprises trading bots for different types of trading strategies, trading pairs, and exchanges. The component also partially covers special event detection: it automatically suspends trading strategies for which an irregularity is detected during automated trading (such as invalid API key permissions), eliminating any strategies that might cause temporary blacklisting, by third-party exchanges, of the servers used for order execution.
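The queue-based worker pattern described above can be sketched as follows. This is a single-threaded-queue illustration (the real engine uses multiprocessing across servers); the `Order` fields, the sentinel-shutdown convention, and the fake exchange adapter are all assumptions for the example.

```python
import queue
import threading
from dataclasses import dataclass


@dataclass
class Order:
    strategy_id: str
    exchange: str
    symbol: str
    side: str        # "buy" or "sell"
    quantity: float


def submission_worker(orders, results, submit):
    """Drain the order queue, submitting each order via an exchange adapter."""
    while True:
        order = orders.get()
        if order is None:            # sentinel value: shut the worker down
            break
        try:
            results.append(submit(order))
        except Exception:
            # In the real engine a failure here would flag the strategy
            # for suspension by the Special Event Detection Engine.
            results.append(("failed", order.strategy_id))


# Usage sketch with a fake exchange adapter in place of a real REST client.
orders = queue.Queue()
results = []
worker = threading.Thread(
    target=submission_worker,
    args=(orders, results, lambda o: ("submitted", o.symbol)),
)
worker.start()
orders.put(Order("s1", "binance", "BTCUSDT", "buy", 0.01))
orders.put(None)
worker.join()
```

Decoupling signal generation from order submission through a queue lets each bot be throttled or suspended independently without blocking the others.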
The components are followed by detailed workflow documentation.
The Special Event Detection Engine is essential for proper user, payments, and trading portfolio management on the VLADAM platform. The main functionality of this engine is to detect any abnormalities that would interrupt the correct functioning of the above-mentioned VLADAM components. The two main functionalities include:
A. Ensuring that the VLADAM platform does not violate the code of conduct set by 3rd party exchanges.
B. Ensuring that VLADAM platform users do not manually interfere with VLADAM order execution (violate VLADAM rules of engagement)
The engine comprises a set of system-wide checks and cron jobs (periodically executed code) that run repeatedly to detect deviations from normal behavior with respect to the two functionalities above.
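The shape of such periodic checks might look like the sketch below. The two example checks mirror functionalities A and B above, but the strategy fields (`key_permissions`, `external_orders`) and the suspension reasons are hypothetical; the real engine's checks and data model are not shown in the source.

```python
def run_checks(strategies, checks):
    """Run each check against each active strategy; return strategies to suspend."""
    to_suspend = []
    for strategy in strategies:
        for check in checks:
            reason = check(strategy)
            if reason:
                to_suspend.append((strategy["id"], reason))
                break                # one reason is enough to suspend
    return to_suspend


def invalid_api_key(strategy):
    """Functionality A: keys without trade permission would violate exchange rules."""
    if "trade" not in strategy.get("key_permissions", []):
        return "API key lacks trade permission"
    return None


def manual_interference(strategy):
    """Functionality B: user placed orders outside the engine on a managed account."""
    if strategy.get("external_orders", 0) > 0:
        return "manual orders detected on managed account"
    return None
```

A scheduler (e.g. cron) would invoke `run_checks` on an interval and feed the result to whatever suspends strategies.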
This component is based on Python 3.7 and is also documented in detail.
The main functionalities of the Reporting and Back Testing Engine are:
1. Monthly generation of results for all supported strategies on the VLADAM platform. These include flat-short, flat-long, and short-long trading strategies based on trading signals, accounting for trading fees, slippage, and, in the case of derivatives, funding. For each supported exchange (Binance, Bitmex, CoinbasePro, FTX, ByBit) there is a set of scripts for data pooling and trading-strategy backtesting. The backtesting results are then easily imported into the platform database.
2. Generation of best-performing strategies.
The engine has been implemented in Python 3.
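As an illustration of the kind of cost-aware backtest described above, here is a minimal flat-long sketch: hold the asset while the signal says "buy", stay in cash otherwise, and pay a fee plus slippage on every position flip. The fee and slippage rates are placeholders, and the real engine's backtests (per-exchange scripts, derivatives funding, short legs) are substantially richer.

```python
def backtest_flat_long(closes, signals, fee=0.00075, slippage=0.0005):
    """Equity multiple of a flat-long strategy over a series of closes.

    closes  -- list of closing prices, one per bar
    signals -- list of actions ("buy"/"sell"/"sit"), one per bar; the signal
               at bar i-1 determines the position held over bar i
    """
    equity, position = 1.0, 0          # start flat with 1 unit of quote currency
    for i in range(1, len(closes)):
        target = 1 if signals[i - 1] == "buy" else 0
        if target != position:
            equity *= 1 - fee - slippage   # cost of entering or exiting
            position = target
        if position:
            equity *= closes[i] / closes[i - 1]
    return equity
```

With zero costs a single round trip simply captures the price move; with realistic costs the same trade returns slightly less, which is exactly the effect the Reporting and Back Testing Engine accounts for.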
The main functionalities of the Invoicing and Payment Engine are:
- providing all relevant information about processed orders (executed trades) for a specific active or historic trading strategy
- exporting processed orders in CSV format
- generating invoices
- invoice management and export in PDF format
- providing information about the performance of active and historic strategies individually as well as the overall performance on a user’s account basis
- providing APIs for paying invoices and notifications about payment transactions
The engine has been implemented in Python 3 using the FastAPI framework. In addition to a set of synchronous API calls, the engine also includes cron jobs and async tasks for automated invoice generation, reporting, and exporting CSV files on request.
The invoicing has been implemented to account for achieved profits in line with the high-water mark approach.
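The high-water mark principle is that a performance fee is charged only on gains above the highest account value previously reached, so users are never billed twice for recovering the same ground. A minimal sketch (the 20% fee rate is a placeholder, not a figure from the source):

```python
def hwm_performance_fee(equity_history, fee_rate=0.20):
    """Total performance fee over a series of account equity snapshots,
    charging only on gains above the previous high-water mark."""
    hwm = equity_history[0]
    total_fee = 0.0
    for equity in equity_history[1:]:
        if equity > hwm:
            total_fee += (equity - hwm) * fee_rate
            hwm = equity                  # ratchet the mark up, never down
    return total_fee
```

For example, an account going 100 → 120 → 90 → 130 pays a fee on the 20 gained to 120 and then only on the 10 above that mark, not on the climb back from 90 to 120.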
The Data Acquisition Engine collects, in real time, the data required by the Trading Signals Generation Engine for hourly signal generation. The Engine has two main components:
- Cryptocurrency trades component – Data acquisition from relevant exchanges related to cryptocurrency trading with main functions being:
- Collecting executed orders in real-time from multiple exchanges
- Processing executed orders to create Open, High, Low, Close, Volume, and Typical Price with 1-minute granularity
- Processed and raw data are stored in the MySQL database
- Social Media component - Acquisition of the data from social media feeds related to cryptocurrency trading with main functions being:
- Collecting Twitter data in real time based on:
- Identified keywords
- A list of users (crypto influencers)
- Performing sentiment analysis on the collected data and storing the pre-processed sentiment score with 1-minute granularity
The engine has been implemented in the Python 2 programming language.
The engine collects trading data in real time from more than 10 different cryptocurrency exchanges using web socket technologies.
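The trade-to-bar processing step described above can be sketched as follows: bucket raw executed trades into 1-minute windows and derive Open, High, Low, Close, Volume, and the Typical Price for each window. The (timestamp, price, size) tuple format and the typical-price formula (H+L+C)/3 are assumptions for the example; the engine's actual schema is not shown in the source.

```python
from collections import defaultdict


def aggregate_1min(trades):
    """Bucket raw trades (ts_seconds, price, size) into 1-minute OHLCV bars."""
    buckets = defaultdict(list)
    for ts, price, size in trades:
        buckets[ts // 60 * 60].append((ts, price, size))

    bars = {}
    for minute, rows in sorted(buckets.items()):
        rows.sort()                           # order trades within the minute
        prices = [p for _, p, _ in rows]
        o, h, l, c = prices[0], max(prices), min(prices), prices[-1]
        bars[minute] = {
            "open": o, "high": h, "low": l, "close": c,
            "volume": sum(s for _, _, s in rows),
            "typical": (h + l + c) / 3,       # common typical-price definition
        }
    return bars
```

In production this aggregation would run continuously on the websocket trade stream, with each completed bar written to the database for the signal engine to consume.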
The Trading Signals Generation Engine comprises a set of trained AI models for generating trading signals for a specific strategy and trading pair. The pre-trained AI models are based on Deep Reinforcement Learning, a crossover between deep neural networks and agent-based reinforcement learning. Unlike traditional AI models that are trained on datasets pre-annotated with trading signals made by a professional trader, reinforcement learning takes a trial-and-error approach. In this novel category of machine learning, intelligent machines, so-called agents, learn from their actions much as humans learn from experience. An agent in this framework tries a huge number of different action sequences over a given time frame and is rewarded or penalized based on the success of its actions. Thus, the Deep Reinforcement Learning model learns by itself when to buy, sell, or sit, and this learning is based on previous experience alone.
For any given hour, our trading models make a decision about the action based on previous K hours (K is our engineering secret). Thus, the data resampling component will feed the running AI model with the previous K hours with hourly calculated financial features. Not every AI model we have uses the same financial features - they are selected per cryptocurrency, based on extensive research that led to the best return on investment.
As mentioned above, the AI model can generate three actions: buy, sell, or sit. For every hour, each of these actions is accompanied by a probability, or the “strength” of the signal. The higher the probability, the more certain the AI model is that the action is the right one. Typically, the deep reinforcement learning model chooses the highest probability and takes the action tied to it.
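The argmax policy described above is trivially small in code. The optional confidence threshold below (falling back to "sit" when the strongest signal is weak) is an illustrative extension, not something the source states the engine does:

```python
def choose_action(probabilities, min_strength=0.0):
    """Pick the highest-probability action from the model's output;
    optionally fall back to 'sit' when the signal strength is below
    a confidence threshold (illustrative extension)."""
    action = max(probabilities, key=probabilities.get)
    return action if probabilities[action] >= min_strength else "sit"
```

Storing the probabilities alongside the chosen action also lets downstream components report the "strength" of each signal to users.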
Each trained AI model can generate signals for multiple supported trading strategies, e.g., the same signal for Ethereum can be used for trading on Binance and Coinbase Pro.
The trained AI model is loaded within the Trading Signals Generation Engine, which fetches the preprocessed data gathered by the Data Acquisition Engine in real time, extracts a set of features from the data, and feeds the trained AI model with the extracted features for decision making. The result (trading signal) is stored in the VLADAM relational database and used by the Order Submission Engine to submit real-time trades to third-party exchanges for all active trading strategies that rely on the specific trading signal generator.
The Trading Signal Generation Engine is based on Python 3. It includes trading signal generators for real-time/production signal generation, trading signal generation for testing with historic data, an ensemble approach for combining multiple individual AI models into a single ensemble model, etc.
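The source does not specify how the ensemble combines models; one common scheme, shown here purely as an illustration, is to average the per-model action probabilities and then take the strongest combined action:

```python
def ensemble_signal(model_outputs):
    """Combine per-model action probabilities by simple averaging
    (one common ensembling scheme; weighting by each model's historic
    performance would be a natural refinement)."""
    actions = model_outputs[0].keys()
    avg = {
        a: sum(m[a] for m in model_outputs) / len(model_outputs)
        for a in actions
    }
    return max(avg, key=avg.get), avg
```

Averaging smooths out the occasional outlier decision from a single model, which is one of the main motivations for ensembling signal generators.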