iDAAS-Connect-Cloud is ONLY intended to enable iDAAS connectivity to public cloud platforms. The vendors currently supported are AWS, Azure, and GCP.
We are focused on continuous improvement. Across our numerous implementations and partner implementations we have focused on overall success, and as we progress forward the intent is to keep delivering that success consistently. Please find details on how to help us here.
For any repository to be implemented there are two types of requirements: overall general requirements and specific submodule requirements. The general pre-requisites can be found here, near the bottom, in the Platform General Pre-Requisites section.
The following section covers the details of implementing this repository's assets.
Within each submodule/design pattern/reference architecture in this repository there is a specific README.md. Each README.md follows a common format covering a solution definition, how we look to continually improve, pre-requisites, implementation details (including specialized configuration), and known issues with their potential resolutions. Because there are many individual capabilities, we have tried to keep the content relevant and specific to each topic.
- For cloning, building and running of assets that content can be found here.
- Within each implementation there is a management console. The management console provides the same interface and capabilities no matter which implementation you are working within; some specifics and details can be found here.
As of the time of this content's publication there are no known specific issues. The ONLY consistently common issue is failing to set application.properties before running the application.
The following section covers known issues, challenges, and other implementation-specific details.
This repository follows a very common general implementation. The only connector currently in this code base is a Kafka topic. The key scenario this can demonstrate is data being processed from a data science Kafka topic.
- The Kafka client connects to a particular broker and topic and checks if there is any data to process.
- If there is data, it will audit the transaction processing.
- The transaction will be routed for processing within iDAAS KIC.
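The steps above can be sketched as follows. This is an illustrative sketch only, not the repository's actual code: the real implementation uses a Kafka consumer, while here an in-memory queue stands in for the Kafka topic, and the names `CloudFlowSketch`, `auditTransaction`, and `routeToKIC` are hypothetical.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of the poll -> audit -> route flow described above.
// An in-memory queue stands in for the Kafka broker/topic.
class CloudFlowSketch {

    // Stand-in for the Kafka topic the client polls.
    private final Deque<String> topic = new ArrayDeque<>();

    void publish(String payload) {
        topic.addLast(payload);
    }

    // Step 1: connect to the broker/topic and check for data to process.
    int pollAndProcess() {
        int processed = 0;
        String payload;
        while ((payload = topic.pollFirst()) != null) {
            auditTransaction(payload); // Step 2: audit the transaction processing
            routeToKIC(payload);       // Step 3: route for processing within iDAAS KIC
            processed++;
        }
        return processed;
    }

    // Hypothetical auditing hook; the real platform records audit events.
    private void auditTransaction(String payload) {
        System.out.println("AUDIT: " + payload);
    }

    // Hypothetical routing hook into iDAAS KIC.
    private void routeToKIC(String payload) {
        System.out.println("ROUTE->KIC: " + payload);
    }
}
```

The point of the sketch is the ordering of the three steps, not the transport: swapping the in-memory queue for a real Kafka consumer does not change the audit-then-route sequence.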
Supported properties include:

```properties
# Server - Internal
server.port=9983
# Kafka
kafkaBrokers=localhost:9092
# JDBC Database
spring.datasource.url=jdbc:mysql://localhost/idaas
# For PostgreSQL: jdbc:postgresql://localhost:5432/idaas
spring.datasource.username=idaas
spring.datasource.password=@idaas123
#spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
# For PostgreSQL: org.postgresql.Driver
```
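In the running application, Spring Boot binds application.properties automatically; as a minimal sketch of what that parsing amounts to, the JDK's own `java.util.Properties` can read the same `key=value` format. The class name `PropertiesSketch` here is a hypothetical illustration, not part of the repository.

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Properties;

// Minimal sketch: reading key=value settings like the ones above
// with the JDK's Properties class ("#" lines are treated as comments).
class PropertiesSketch {

    static Properties load(Reader source) {
        Properties props = new Properties();
        try {
            props.load(source);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return props;
    }

    public static void main(String[] args) {
        String sample = String.join("\n",
                "# Kafka",
                "kafkaBrokers=localhost:9092",
                "spring.datasource.username=idaas");
        Properties props = load(new StringReader(sample));
        System.out.println(props.getProperty("kafkaBrokers")); // localhost:9092
    }
}
```

This also shows why forgetting a key is the common failure mode noted above: a missing property simply comes back as `null`, so the error surfaces later at connection time rather than at load time.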
Happy using and coding....