
Artificial Intelligence & Machine Learning Application
Our company is a platform for creating AI-enabled data products, and we want to leverage the latest technology to gain a market advantage. Applications of this technology include recommendation algorithms, predictive analytics (e.g., customer lifetime value), fraud detection, cybersecurity analytics, and classification. We would like to collaborate with students to apply the latest artificial intelligence (AI) and machine learning (ML) techniques to our existing dataset. Students will develop an AI/ML model related to any of the aforementioned applications. This will involve several steps, including:
- Conducting background research for specific problem domains (provided at the beginning of the project)
- Analyzing the dataset and identifying data requirements
- Researching the latest AI/ML techniques and how they could be applied to our data
- Developing an AI/ML model that provides unique outcomes or insights into our data
- Providing multiple solutions that can be applied to solve the same problem (the key is to build a "deployable" solution)
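As a sketch of one of the applications listed above, a recommendation algorithm can start from something as simple as user-based collaborative filtering. The data and item names below are purely illustrative assumptions, not the company's dataset:

```python
import math

# Hypothetical ratings data (illustrative only): user -> {item: rating}.
ratings = {
    "alice": {"flight_deal": 5.0, "hotel_pkg": 3.0, "car_rental": 4.0},
    "bob":   {"flight_deal": 4.0, "hotel_pkg": 3.5, "tour_pass": 5.0},
    "carol": {"hotel_pkg": 2.0, "car_rental": 5.0, "tour_pass": 1.0},
}

def cosine_similarity(a, b):
    """Cosine similarity over the items two users have both rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[i] * b[i] for i in shared)
    norm_a = math.sqrt(sum(a[i] ** 2 for i in shared))
    norm_b = math.sqrt(sum(b[i] ** 2 for i in shared))
    return dot / (norm_a * norm_b)

def recommend(user, ratings, top_n=1):
    """Rank unseen items by similarity-weighted ratings from other users."""
    scores, weights = {}, {}
    for other, other_ratings in ratings.items():
        if other == user:
            continue
        sim = cosine_similarity(ratings[user], other_ratings)
        if sim <= 0:
            continue
        for item, rating in other_ratings.items():
            if item in ratings[user]:
                continue  # skip items the user has already rated
            scores[item] = scores.get(item, 0.0) + sim * rating
            weights[item] = weights.get(item, 0.0) + sim
    ranked = sorted(((scores[i] / weights[i], i) for i in scores), reverse=True)
    return [item for _, item in ranked[:top_n]]
```

A production version would swap this for a library or learned model, but the same ideas (similarity, weighting, ranking unseen items) carry over.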

Backend AI API development and third party API Integration
Preeminent Technologies is fostering technological advancement in the global travel sector by building innovative solutions in Canada. We are building a traveler-centric, comprehensive enterprise travel planning and booking platform enabled by AI, big data analytics, and LLMs/machine learning.

Segment 1: As back-end developers, the interns will be responsible for developing and maintaining cloud-based Azure Functions and backend APIs using .NET 8, C#, Azure, and enterprise APIs. They will collaborate with cross-functional teams to design, develop, and implement scalable, secure, cloud-hosted solutions. Additionally, they will participate in code reviews, troubleshoot system issues, and contribute to continuous improvement initiatives.

Segment 2: We are developing programs and training automations for analyzing AI datasets; working with OpenAI and conversational and generative AI models; building automation for AI and LLM/ML model training; fine-tuning AI and LLM/ML models; and running automated test processes to validate results against real-time data, price, availability, and other parameters. The machine-learning problem we are addressing revolves around optimizing search processes within the travel domain, particularly when dealing with datasets sourced from multiple providers. We are reaching out to explore potential collaboration opportunities to support these development activities, and would welcome the opportunity to connect with your team to discuss further details.

Required Expertise and Skills: The ideal candidates will have a strong background in software development, cloud platforms, SDLC concepts, and the clean architecture approach, and possess the technical skills necessary to integrate various APIs and modern technologies seamlessly.
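The automated validation step in Segment 2 can be sketched as a tolerance check of model output against live provider data. Everything here is an illustrative assumption (offer IDs, prices, and the 10% tolerance are invented), written in Python rather than the platform's C# stack for brevity:

```python
# Illustrative sketch: flag model-predicted fares that drift too far from
# "live" provider quotes, or whose offers are no longer available.

TOLERANCE = 0.10  # assumed: fail predictions more than 10% off the live price

def validate_predictions(predicted, live, tolerance=TOLERANCE):
    """Compare predicted vs. live prices; return (all_passed, failures).

    predicted / live: dicts mapping an offer id to a price. A prediction
    fails if its relative error exceeds `tolerance`, or if the offer is
    missing from the live feed (no longer available).
    """
    failures = []
    for offer_id, pred_price in predicted.items():
        live_price = live.get(offer_id)
        if live_price is None:
            failures.append((offer_id, "unavailable"))
            continue
        rel_err = abs(pred_price - live_price) / live_price
        if rel_err > tolerance:
            failures.append((offer_id, f"off by {rel_err:.1%}"))
    return (not failures, failures)

predicted = {"YYZ-LHR-0612": 820.0, "YYZ-CDG-0614": 640.0, "YYZ-NRT-0701": 1450.0}
live      = {"YYZ-LHR-0612": 799.0, "YYZ-CDG-0614": 910.0}

ok, failures = validate_predictions(predicted, live)
```

In the real pipeline the `live` dict would be populated from provider API responses, and failures would feed back into model retraining or alerting.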
We are looking to fill technical intern roles for intermediate backend developers, who will be responsible for developing and maintaining cloud-based applications using .NET 8, C#, the Azure OpenAI service, and enterprise API integrations.

Synthetic data project framework
Our company is interested in creating frameworks/templates for our pilot projects with social impact clients. The goal of this work is to maintain efficient operational execution in how we structure our client work. We would like to collaborate with students to provide cohesive, appropriate details relative to the pilot project scaffolding from our Data Scientists. Students will write technical statistical documentation on methodologies and write code (likely in Python, using Jupyter and/or Marimo notebooks) so we can achieve a thorough pre-pilot understanding of the workflow involved. This will involve several steps, including:
- Working under Data Scientist guidance to write code that assesses initial data type, structure, etc., especially to determine appropriate analytical tooling for client data use-case categories
- Expanding on a baseline Data Scientist framework document for synthetic data 'utility' metrics, i.e., how useful the synthetic data is to a given statistical model
- Completing the above across analytical objectives including, but not limited to, dimension reduction, relationship analysis, and clustering
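One simple synthetic data 'utility' metric of the kind described above compares the marginal statistics of a synthetic column against its real counterpart. The data, scoring formula, and function name below are illustrative assumptions; a fuller framework would also compare correlations and train-on-synthetic/test-on-real model performance:

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def std(xs):
    """Population standard deviation."""
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

def marginal_utility(real, synthetic):
    """Score in [0, 1]; 1.0 means mean and std match the real column exactly.

    Errors are scaled by the real column's spread, so the score is unit-free.
    This is a deliberately simple stand-in for richer utility metrics.
    """
    scale = std(real) or 1.0
    mean_err = abs(mean(real) - mean(synthetic)) / scale
    std_err = abs(std(real) - std(synthetic)) / scale
    return max(0.0, 1.0 - (mean_err + std_err) / 2)

# Assumed toy columns: one faithful synthetic draw, one badly shifted.
real_col   = [10.0, 12.0, 11.0, 13.0, 14.0]
good_synth = [10.5, 12.5, 11.0, 13.0, 13.5]
bad_synth  = [40.0, 41.0, 39.0, 42.0, 38.0]
```

Wrapping checks like this in a notebook, with the thresholds documented, gives the pre-pilot framework a repeatable way to decide whether a synthetic dataset is fit for a given analytical objective.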