AI Infrastructure

3D model creation is highly dependent on the data used for training. At 3DFY.ai, we have developed a data-centric infrastructure and an accompanying methodology, designed for rapid adaptation to new applications and use cases.

The overall pipeline comprises four main building blocks: an input module, a data engine, a computational pipeline, and a validation module.

 
1. Input module

3DFY.ai operates on any image format without making assumptions about the acquisition setup. In particular, it can handle images of complex scenes taken under arbitrary lighting conditions and camera setups.

The 3DFY.ai computational pipeline is designed to make the most of existing data and can utilize any number of input images for a given object. In fact, the only requirement is that the object to 3DFY appears in all of the input images.

The input module can process images from various sources, including network drives, databases, and even directly from websites.
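
As a rough illustration of this flexibility, the sketch below shows what a source-agnostic input collection step could look like in Python. The `ObjectCapture` class and its methods are illustrative assumptions, not 3DFY.ai's actual API.

```python
from dataclasses import dataclass, field
from pathlib import Path
from typing import List
from urllib.request import urlopen


@dataclass
class ObjectCapture:
    """A single object to be reconstructed, with any number of images."""
    object_id: str
    images: List[bytes] = field(default_factory=list)

    def add_from_file(self, path: str) -> None:
        # Local or network-mounted drive: read the raw bytes as-is,
        # without assuming a particular format or camera setup.
        self.images.append(Path(path).read_bytes())

    def add_from_url(self, url: str) -> None:
        # Images can also be pulled directly from a website.
        with urlopen(url) as response:
            self.images.append(response.read())

    def is_valid(self) -> bool:
        # The only hard requirement: at least one image showing the object.
        return len(self.images) >= 1


# Usage: one object, images gathered from mixed sources (paths are hypothetical).
capture = ObjectCapture(object_id="chair-0042")
capture.add_from_file("/mnt/shared/chairs/chair-0042_front.jpg")
capture.add_from_url("https://example.com/images/chair-0042_side.png")
assert capture.is_valid()
```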

2. Data engine

The 3DFY.ai data engine is designed to efficiently generate and manage the different types of data needed to train our algorithmic pipeline. Our core AI models are trained on high-quality synthetic 3D models, which are typically slow to create and expensive to acquire.

Therefore, at the core of our data engine is the capability to procedurally generate additional 3D models in a highly automated manner, both to create new datasets and to enrich existing ones. This makes it much easier to improve model performance in a cost-effective way.
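
To make the idea of procedural generation concrete, here is a minimal sketch of parameter-driven dataset enrichment. The object class, parameter names, and ranges are hypothetical; a real pipeline would turn each parameter set into a mesh and rendered views.

```python
import random
from dataclasses import dataclass
from typing import List


@dataclass
class ChairParams:
    # Hypothetical parametric description of a single object class.
    seat_width: float
    seat_depth: float
    leg_height: float
    back_angle_deg: float


def sample_params(rng: random.Random) -> ChairParams:
    # Draw each parameter from a plausible range so every sample
    # yields a distinct but valid object variant.
    return ChairParams(
        seat_width=rng.uniform(0.38, 0.55),
        seat_depth=rng.uniform(0.38, 0.50),
        leg_height=rng.uniform(0.40, 0.50),
        back_angle_deg=rng.uniform(95.0, 110.0),
    )


def generate_dataset(n_models: int, seed: int = 0) -> List[ChairParams]:
    # Procedurally generate n_models parameter sets; downstream steps would
    # convert each one into a 3D model and training imagery.
    rng = random.Random(seed)
    return [sample_params(rng) for _ in range(n_models)]


dataset = generate_dataset(n_models=1000)
print(len(dataset), dataset[0])
```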

Processing large 3D datasets is a computationally demanding task. 3DFY.ai mitigates this with native infrastructure that distributes the computation over large clusters of cloud machines, enabling large datasets to be prepared at a fraction of the time and cost.
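
The snippet below sketches this fan-out pattern using local worker processes as a stand-in; 3DFY.ai's actual infrastructure distributes the same per-model work across cloud machines, and `prepare_model` is a hypothetical placeholder for the heavy per-model processing.

```python
from concurrent.futures import ProcessPoolExecutor
from typing import List


def prepare_model(model_path: str) -> str:
    # Placeholder for the per-model work (e.g. re-meshing, rendering,
    # annotation); returns the path of the prepared artifact.
    out_path = model_path + ".prepared"
    # ... heavy geometry/rendering work would happen here ...
    return out_path


def prepare_dataset(model_paths: List[str], workers: int = 8) -> List[str]:
    # Fan the per-model work out over a pool of workers; a cloud setup
    # would fan out over many machines instead of local processes.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(prepare_model, model_paths))


if __name__ == "__main__":
    paths = [f"models/model_{i:05d}.glb" for i in range(100)]
    prepared = prepare_dataset(paths)
    print(f"prepared {len(prepared)} models")
```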

3. Computational pipeline

The 3DFY.ai computational pipeline consists of multiple DL models that we designed, developed, and optimized to carry out the various processing stages, which are linked together using computer-graphics know-how.

Since model training is computationally intensive, 3DFY.ai developed an in-house, cloud-based training infrastructure that distributes the computational load across many machines, significantly reducing training time and cost.
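
As a purely illustrative example of the data-parallel pattern (not 3DFY.ai's proprietary infrastructure), the following PyTorch sketch assumes the script is launched with `torchrun --nproc_per_node=N`, which sets the rank and world-size environment variables for each worker; the model and data are toy stand-ins.

```python
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main() -> None:
    # torchrun provides RANK / WORLD_SIZE; "gloo" keeps the sketch CPU-friendly.
    dist.init_process_group(backend="gloo")

    model = torch.nn.Linear(128, 64)        # stand-in for a real DL model
    ddp_model = DDP(model)                  # synchronizes gradients across workers
    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=1e-3)
    loss_fn = torch.nn.MSELoss()

    for _ in range(100):
        inputs = torch.randn(32, 128)       # each worker trains on its own data shard
        targets = torch.randn(32, 64)
        optimizer.zero_grad()
        loss = loss_fn(ddp_model(inputs), targets)
        loss.backward()                     # gradient all-reduce happens here
        optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```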

4. Validation module

This module closes the loop between the data and the AI models and is driven by client specifications. Some of the DL models may fail validation after the first training iteration. This is expected, as the initial training dataset is usually curated with speed and cost-effectiveness as a priority.

Whenever a model fails, its failure modes are analyzed and additional data is procedurally generated to address those shortcomings. The model is then re-trained and re-validated, and this process is repeated until the success criteria are met.
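
The closed loop described above can be summarized by the following sketch; the function signatures are hypothetical stand-ins for the actual training, validation, and data-generation components.

```python
from typing import Callable, List


def train_until_valid(
    train: Callable[[List[str]], object],
    validate: Callable[[object], List[str]],
    generate_data: Callable[[List[str]], List[str]],
    initial_data: List[str],
    max_iterations: int = 10,
) -> object:
    """Closed loop: train, validate against client-driven criteria,
    procedurally generate data targeting the observed failure modes,
    and repeat until validation passes or the iteration budget runs out."""
    dataset = list(initial_data)
    model = train(dataset)
    for _ in range(max_iterations):
        failure_modes = validate(model)          # e.g. ["thin structures", "glossy materials"]
        if not failure_modes:
            return model                         # success criteria met
        dataset += generate_data(failure_modes)  # enrich the dataset where it falls short
        model = train(dataset)                   # re-train, then re-validate on the next pass
    raise RuntimeError("validation criteria not met within the iteration budget")
```

The loop terminates as soon as validation reports no remaining failure modes, mirroring the re-train and re-validate cycle described above.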