
Hiring our 2nd batch for Finland: Remote AI/ML Engineers

Apply here: https://lnkd.in/dKASSCEK

and join our team at Talent Bridge Finland as we build the SUSAN.fi platform

_______________________________________________

🔧 WHAT YOU’LL DO

LLM SYSTEMS & AGENTS

Design and implement modular, scalable LLM-based chatbot systems (OpenAI, Claude, Mistral, etc.)

Support voice + text multimodal interfaces

Architect multi-LLM pipelines with dynamic switching and configuration

Build production-grade RAG systems with optimized vector databases

Develop advanced function/tool calling and multi-step agent flows

Optimize latency and inference cost, and reduce hallucinations

BACKEND & INTEGRATION

Collaborate with backend teams on async architectures (queues, events, webhooks)

Build knowledge integration layers (SQL/NoSQL/graph-based)

Own AI-serving architecture (load balancing, batching, worker pooling)

DOCUMENT UNDERSTANDING & ETL

Build scalable ETL pipelines for structured/unstructured data

Develop OCR, entity extraction, and semantic search pipelines

Automate document ingestion (PDFs, scanned docs, emails, enterprise knowledge)

Apply embedding + metadata strategies for retrieval optimization

CLOUD & INFRASTRUCTURE (GCP-FOCUSED)

Architect and deploy pipelines on Google Cloud Platform

Hands-on with BigQuery, Pub/Sub, Dataflow, Vertex AI, Cloud Run

Implement secure, cost-optimized governance and monitoring

Manage CI/CD for AI workloads, containerized deployments, Kubernetes/GKE

_______________________________________________

🎯 WHAT WE’RE LOOKING FOR

Apply here: https://lnkd.in/dKASSCEK

Core AI/ML

2–3+ years building production AI/ML systems

Strong in Python + async frameworks (FastAPI preferred)

Deep knowledge of LLMs, embeddings, prompt chaining, few-shot learning

Hands-on with LangChain / LlamaIndex / RAG frameworks

Proficiency in vector DBs (Pinecone, Qdrant, Weaviate, Chroma)

Experience orchestrating agentic workflows (LangGraph, CrewAI, AutoGen)

Familiarity with multimodal models (GPT-4o, Claude, Gemini, vision/speech models)

Document Understanding & Data Engineering

Experience with ETL pipelines & large-scale ingestion

OCR, NLP, entity extraction, classification expertise

Strong data modeling & schema design

Cloud (GCP is a must-have)

Advanced GCP experience (BigQuery, Vertex AI, Dataflow, Pub/Sub, Cloud Functions)

Production experience in Document AI & workflow orchestration (Composer/Airflow)

Skilled in cost optimization, monitoring, observability

_______________________________________________

BONUS SKILLS

SQL/NoSQL (PostgreSQL, MongoDB, Redis)

Async task serving (Celery, Kafka, Redis Streams)

Familiarity with AI evaluation methods (hallucination, grounding, safety metrics)

Dockerized deployment & multi-cloud familiarity

_______________________________________________

Apply here: https://lnkd.in/dKASSCEK