Open Positions
Job ID
Mandatory Skills
- Power BI Development
- SQL
- DAX Queries
- ETL – SSIS/SSRS/SSAS
No of Vacancy
Posted on
Responsibilities:
- Manage BI solutions including portals, dashboards, and standard/ad-hoc reporting, with a focus on data management and data warehouse architecture
- Evaluate existing data collection and analytics systems to identify improvements
- Build predictive models and machine learning algorithms
- Analyze large datasets to identify trends and patterns
- Establish procedures for secure and efficient data sharing with internal and external stakeholders
- Ensure the use of accurate and secure data extraction methods
- Provide technical direction and mentoring to developers working on enterprise data tools (Access, SSRS, SSIS, SQL Server, Power BI, Power Pivot)
- Ensure adherence to SDLC best practices, including planning and project estimation
- Maintain resource forecasting, resolve conflicts and risks, and ensure optimal resource utilization
- Design and develop dimensions, hierarchies, and OLAP cubes
- Develop calculated measures using MDX
- Provide oversight and mentoring to the BI development team
- Maintain and enhance technical competencies while promoting continuous learning and adoption of emerging technologies
- Achieve certifications and work towards becoming a subject matter expert in key technologies
Skills Required:
- Minimum 5–7 years of experience with enterprise-class ETL/BI tools (Microsoft SQL, SSRS, SSIS, SSAS, Power BI)
- Experience working as a BI Consultant, Data Scientist, or in a similar role
- Strong knowledge of machine learning concepts and applications
- Hands-on experience with BI tools (Tableau, Microsoft Power BI) and data processing frameworks (Hadoop, Samza)
- Strong skills in statistics, reporting, and data mining
- In-depth analytical experience with a solid understanding of IT processes
- Experience with Azure Data Factory and Azure Integration Services
- Proficiency in software testing, report writing, and database management
- Experience in requirements gathering and documentation
- Hands-on experience in designing and developing SSIS packages
- Ability to mentor and guide junior team members
Job ID
Mandatory Skills
- Agentic AI
- LLM
- RAG
No of Vacancy
Posted on
Role Overview
We are seeking an AI Engineer with strong hands-on experience in Retrieval-Augmented Generation (RAG) and Agentic AI systems. The role focuses on building intelligent, secure, and scalable AI solutions that process unstructured documents, perform semantic matching, and automate enterprise workflows. Experience with LLM fine-tuning and modern LLM frameworks such as LangChain is essential.
Key Responsibilities
- Design and develop RAG-based systems for document intelligence and semantic matching
- Build Agentic AI workflows for request validation, reasoning, ranking, and action automation
- Implement document ingestion pipelines including parsing, chunking, embeddings, and indexing
- Develop vector search pipelines with metadata filtering and relevance scoring
- Perform LLM fine-tuning using techniques such as LoRA, PEFT, or supervised fine-tuning
- Integrate AI solutions with backend services, APIs, and email/notification systems
- Ensure security, access control, and compliance across AI pipelines
- Continuously improve retrieval accuracy, system performance, and reliability
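The core retrieval step behind the RAG work described above — embedding document chunks and ranking them by similarity to a query — can be sketched in plain Python. This is a minimal illustration only: the toy vectors stand in for real embedding-model output, and all names (`retrieve`, `index`, the chunk labels) are invented for the example.

```python
from math import sqrt

def cosine(a, b):
    # Cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, index, top_k=2):
    # Rank stored chunks by similarity to the query embedding
    scored = [(cosine(query_vec, vec), chunk) for chunk, vec in index]
    return [chunk for score, chunk in sorted(scored, reverse=True)[:top_k]]

# Toy "embeddings": in practice these come from an embedding model
index = [
    ("invoice policy", [0.9, 0.1, 0.0]),
    ("travel policy", [0.1, 0.9, 0.0]),
    ("security policy", [0.0, 0.2, 0.9]),
]

print(retrieve([0.8, 0.2, 0.1], index, top_k=1))  # → ['invoice policy']
```

In production the index would live in a vector database (FAISS, Pinecone, Azure AI Search) with metadata filtering applied before or alongside the similarity ranking.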
Required Skills
- Strong Python programming skills
- Hands-on experience with RAG architectures and semantic search
- Experience building Agentic AI or multi-agent workflows
- Experience with LLM fine-tuning techniques (LoRA, PEFT, prompt tuning)
- Experience with at least one LLM framework: LangChain (preferred)
- Experience with vector databases such as FAISS, Pinecone, or Azure AI Search
- Strong understanding of embeddings and similarity-based retrieval
Good to Have
- Experience with Azure OpenAI, OpenAI APIs, or open-source LLMs
- Experience deploying AI services using FastAPI, Docker, and cloud platforms
- Knowledge of RAG evaluation and hallucination mitigation techniques
- Domain exposure in Life Sciences, Retail, or Insurance
What You’ll Work On
- AI-driven talent matching and recommendation systems
- Secure document intelligence platforms
- Agentic AI systems for workflow automation and decision support
Why Join Us
- Work on real-world, production-grade Agentic AI solutions
- High ownership and impact across enterprise use cases
- Opportunity to work with cutting-edge Generative AI technologies
Job ID
Mandatory Skills
- Architecture Design
- Expertise in PHP/.NET/Node.js/Java (any core web stack)
- Microservices Architecture
- Cloud platforms – AWS, Azure, GCP
No of Vacancy
Posted on
We are looking for a seasoned Technical Architect with a strong foundation in web and eCommerce application development. This role demands a technology leader who can drive scalable architecture, mentor developers, and ensure high-performing, secure, and user-centric platforms.
Key Responsibilities:
- Lead architecture design for large-scale web and eCommerce platforms (B2B/B2C)
- Define technical roadmaps and evaluate emerging technologies
- Collaborate with product owners, UI/UX designers, and developers to align system architecture with business goals
- Ensure scalability, performance, and security standards in architecture design
- Guide the full development lifecycle from concept to deployment
- Provide code reviews, technical guidance, and mentoring for development teams
- Implement DevOps best practices and CI/CD pipelines
- Document architecture decisions, standards, and processes
- Manage integrations with 3rd-party platforms, payment gateways, APIs, ERP/CRM tools
Technical Skills:
- Deep expertise in PHP/.NET/Node.js/Java (any core web stack)
- Strong understanding of frontend frameworks – React, Angular, Vue.js
- Hands-on experience with eCommerce platforms – Magento, Shopify, WooCommerce, SFCC, or custom
- Proficiency in RESTful APIs, microservices architecture
- Experience with cloud platforms – AWS, Azure, GCP
- Knowledge of database design – MySQL, PostgreSQL, MongoDB
- Familiar with CI/CD tools, version control (Git), containerization (Docker/Kubernetes)
Job ID
Mandatory Skills
- Technical Project Delivery
- Hands-on Coding
- Stakeholder Management
No of Vacancy
Posted on
About the Role
We are seeking a Technical Delivery Manager who will be responsible for end-to-end delivery of complex IT projects and solutions. The ideal candidate will combine technical expertise with leadership capabilities to manage cross-functional teams, ensure timely delivery, drive operational excellence, and exceed client expectations.
Key Responsibilities
- Own and manage the full delivery lifecycle of projects, from planning to deployment.
- Collaborate with stakeholders, product managers, architects, and technical teams to define scope, timelines, and deliverables.
- Act as the single point of contact for technical delivery, ensuring alignment between client needs and technical execution.
- Oversee project delivery teams and ensure adherence to SLAs, budgets, and quality standards.
- Identify and mitigate delivery risks, managing issues proactively to maintain project momentum.
- Drive Agile, Scrum, or other delivery methodologies as appropriate for the project.
- Coordinate with QA, DevOps, and Infrastructure teams to ensure seamless integration and deployment.
- Provide technical guidance and leadership to the delivery team.
- Regularly update clients and internal leadership on project status, risks, and milestones.
- Manage resource allocation, hiring, and performance reviews for project teams (where applicable).
Required Skills & Experience
- Bachelor’s/Master’s degree in Computer Science, Engineering, or related field.
- 10+ years of IT experience with at least 4+ years in a delivery/project management role.
- Strong understanding of software development lifecycles (Agile, Scrum, Waterfall).
- Proven experience in managing complex technical projects in a multi-team environment.
- Hands-on experience with technologies such as .NET, Java, PHP, Cloud (AWS/Azure), DevOps tools, or relevant stacks.
- Strong communication and stakeholder management skills.
- Proficient in project management tools like Jira, Confluence, Azure DevOps, MS Project, etc.
- PMP, PRINCE2, Scrum Master, or SAFe certification is a plus.
Job ID
Mandatory Skills
- SQL Development
No of Vacancy
Posted on
Responsibilities:
- Design databases and ensure their stability, reliability, and performance
- Develop best practices for database design and development activities
- Prepare documentation for database applications
- Test databases and perform bug fixes
- Improve application performance
- Manage memory for database systems
- Fix any issues related to database performance and provide corrective measures
- Build appropriate and useful reporting deliverables
- Resolve and troubleshoot complex issues
- Perform data modeling to visualize database structure
- Perform backup and restore processes
- Conduct database troubleshooting whenever required
- Work with project teams to support database development, implementation, and maintenance
- Perform daily data consolidation activities based on customer requirements
- Work closely with database administrators to understand requirements
- Demonstrate expertise in performance tuning
- Apply good knowledge of high-availability and disaster-recovery concepts
- Demonstrate expertise in replication and database snapshots
- Read execution plans and work in depth with DMVs
- Work comfortably both within teams and without supervision
Skills Required:
- Good written and verbal communication skills
- Strong proficiency with SQL
- Experience with modern relational databases
- Extensive familiarity with data management principles and best practices for properly storing, migrating, and structuring data across multiple applications
- Excellent problem-solving skills, with the ability to isolate and resolve database issues while maintaining access and data integrity
- Strong communication skills for gathering requirements and specifications in collaboration with IT professionals and end users, and for reporting on database issues
- Excellent analytical and organization skills
- An ability to understand front-end users' requirements and a problem-solving attitude
- Expertise in ETL tools
- Expertise in SQL Server Reporting Services, SQL Server Integration Services, and SQL Server Analysis Services
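As a small illustration of the kind of SQL this role involves, the snippet below builds a report-style query with a common table expression (CTE). It uses Python's bundled SQLite purely so the example is self-contained; the role itself is Microsoft SQL Server focused, and the table and column names here are invented.

```python
import sqlite3

# In-memory database with a toy orders table (illustrative schema)
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
INSERT INTO orders VALUES (1,'East',100),(2,'East',250),(3,'West',80);
""")

# A CTE aggregates per-region totals; the outer query filters on them
rows = conn.execute("""
WITH region_totals AS (
    SELECT region, SUM(amount) AS total
    FROM orders
    GROUP BY region
)
SELECT region, total FROM region_totals WHERE total > 100
""").fetchall()

print(rows)  # → [('East', 350.0)]
```

The same CTE pattern carries over directly to T-SQL on SQL Server, where it is a common building block for SSRS report datasets.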
Job ID
Mandatory Skills
- Python / PySpark
- Azure
No of Vacancy
Posted on
Key Responsibilities
- Design, build, and maintain scalable data pipelines using Python & PySpark
- Develop and optimize ETL/ELT workflows for structured and semi-structured data
- Work with Azure data services to ingest, process, and store large datasets
- Write efficient SQL queries for data analysis, transformation, and reporting
- Collaborate with data analysts, data scientists, and business teams to deliver reliable datasets
- Implement data quality checks, monitoring, and performance tuning
- Handle batch and (if applicable) streaming data workloads
- Follow best practices for data security, governance, and documentation
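A rough sketch of the transform step in such a pipeline is shown below — cleaning rows with a basic data-quality check, then aggregating. Plain Python stands in for PySpark so the example stays self-contained, and the record layout and function name are invented for illustration.

```python
from collections import defaultdict

def transform(records):
    # Transform step: drop rows failing a quality check, then
    # aggregate amounts per normalized category key
    totals = defaultdict(float)
    for rec in records:
        if rec.get("amount") is None:  # basic data-quality check
            continue
        totals[rec["category"].strip().lower()] += rec["amount"]
    return dict(totals)

raw = [
    {"category": "Books ", "amount": 12.5},
    {"category": "books", "amount": 7.5},
    {"category": "Toys", "amount": None},  # dropped by the quality check
]

print(transform(raw))  # → {'books': 20.0}
```

In PySpark the same logic would typically be expressed as a `filter` followed by a `groupBy`/`agg` over a DataFrame, distributed across the cluster.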
Required Skills
Programming & Data Processing
- Strong hands-on experience with Python
- Solid expertise in PySpark / Apache Spark
- Experience with data transformations, joins, aggregations, window functions
Databases & SQL
- Advanced SQL (CTEs, indexing, query optimization)
- Experience with relational and analytical databases
Cloud & Azure
- Hands-on experience with Microsoft Azure, including:
- Azure Data Factory (ADF)
- Azure Data Lake Storage (ADLS)
- Azure Synapse Analytics
- Azure Databricks
- Understanding of cloud-based data architecture
Data Engineering Concepts
- ETL vs ELT
- Data modeling (Star / Snowflake schemas)
- Partitioning, caching, performance tuning
- Handling large-scale datasets
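The window functions and partitioning mentioned above can be illustrated with a short query — here run against Python's bundled SQLite (3.25+) so the example is runnable as-is; the table and data are invented for the sketch.

```python
import sqlite3

# In-memory database with a toy sales table (illustrative schema)
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (rep TEXT, region TEXT, amount REAL);
INSERT INTO sales VALUES ('Ana','East',300),('Bo','East',200),('Cy','West',150);
""")

# Window function: rank reps by amount within each region partition
rows = conn.execute("""
SELECT rep, region,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
FROM sales
ORDER BY region, rnk
""").fetchall()

print(rows)  # → [('Ana', 'East', 1), ('Bo', 'East', 2), ('Cy', 'West', 1)]
```

The same `PARTITION BY ... ORDER BY` pattern is available in Spark SQL and in PySpark via `Window.partitionBy(...).orderBy(...)`.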
Good to Have
- Experience with streaming tools (Kafka, Event Hubs)
- Knowledge of CI/CD pipelines for data workflows
- Exposure to Airflow / Azure Data Factory scheduling
- Understanding of DevOps / Git / version control
- Basic knowledge of data governance & compliance
Education
- Bachelor’s degree in Computer Science, Engineering, or a related field (equivalent practical experience is welcome)
What We Offer
- Opportunity to work on large-scale, real-world data platforms
- Exposure to modern cloud & big-data technologies
- Collaborative and growth-focused environment
Didn’t find a suitable opportunity today?
Introduce yourself. Help us help you.
We’re always on the lookout for good talent. Let us know who you are and what your dream job looks like. We will reach out to you when the right role comes up.