

Job Responsibilities
- Design, build & test ETL processes using Python & SQL for the corporate data warehouse
- Inform, influence, support, and execute our product decisions
- Maintain advertising data integrity by working closely with R&D to organize and store data in a format that ensures accuracy and allows the business to quickly identify issues
- Evaluate and prototype new technologies in the area of data processing
- Think quickly, communicate clearly and work collaboratively with product, data, engineering, QA and operations teams
- High energy level, strong team player and good work ethic
- Data analysis, understanding of business requirements and translation into logical pipelines & processes
- Identification, analysis & resolution of production & development bugs
- Support the release process including completing & reviewing documentation
- Configure data mappings & transformations to orchestrate data integration & validation
- Provide subject matter expertise
- Document solutions, tools & processes
- Create & support test plans with hands-on testing
- Peer reviews of work developed by other data engineers within the team
- Establish good working relationships & communication channels with relevant departments
Skills and Qualifications we look for
- University degree (2.1 or higher, or equivalent) in a relevant subject. A Master's degree in a data-related subject is a strong advantage.
- 4-6 years of experience in data engineering.
- Strong coding ability and software development experience in Python.
- Strong hands-on experience with SQL and Data Processing.
- Google Cloud Platform (Cloud Composer, Dataflow, Cloud Functions, BigQuery, Cloud Storage, Dataproc)
- Good working experience with at least one ETL/orchestration tool (Airflow preferred; a minimal sketch follows this list).
- Strong analytical and problem-solving skills.
- Good-to-have skills: Apache PySpark, CircleCI, Terraform.
- Motivated, self-directed, able to work with ambiguity and interested in emerging technologies, agile and collaborative processes.
- Understanding of and experience with Agile/Scrum delivery methodology
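
To make the orchestration expectations above concrete, here is a minimal sketch of the kind of daily Airflow pipeline this role would build: load a Cloud Storage export into BigQuery, then run a SQL transformation. It assumes Airflow 2.4+ with the Google provider installed; the bucket, dataset, table, and column names are illustrative placeholders, not real resources.

# Hypothetical sketch of a daily GCS -> BigQuery ETL pipeline in Airflow.
# Bucket, dataset, table, and column names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="ads_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load the day's raw CSV export from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_ads",
        bucket="example-ads-bucket",
        source_objects=["raw/ads_{{ ds }}.csv"],
        destination_project_dataset_table="analytics.staging_ads",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the staging data into the daily reporting table with SQL.
    transform = BigQueryInsertJobOperator(
        task_id="transform_ads",
        configuration={
            "query": {
                "query": (
                    "INSERT INTO analytics.ads_daily "
                    "SELECT campaign_id, DATE('{{ ds }}') AS day, SUM(cost) AS cost "
                    "FROM analytics.staging_ads GROUP BY campaign_id"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform

In practice the SQL would live in versioned files and the DAG would add retries, alerting, and data-quality checks; this sketch only shows the skeleton.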


At Palcode.ai, we're on a mission to fix the massive inefficiencies in pre-construction. Think about it: in a $10 trillion industry, estimators still spend weeks analyzing bids, project managers struggle with scattered data, and costly mistakes slip through complex contracts. We're fixing this with purpose-built AI agents that work. Our platform cuts preconstruction workflows from weeks to hours. It's not just about AI; it's about bringing real, measurable impact to an industry ready for change. We are backed by names like AWS for Startups, Upekkha Accelerator, and Microsoft for Startups.
Why Palcode.ai
- Tackle Complex Problems: Build AI that reads between the lines of construction bids, spots hidden risks in contracts, and makes sense of fragmented project data
- High-Impact Code: Your code won't sit in a backlog – it goes straight to estimators and project managers who need it yesterday
- Tech Challenges That Matter: Design systems that process thousands of construction documents, handle real-time pricing data, and make intelligent decisions
- Build & Own: Shape our entire tech stack, from data processing pipelines to AI model deployment
- Quick Impact: Small team, huge responsibility. Your solutions directly impact project decisions worth millions
- Learn & Grow: Master the intersection of AI, cloud architecture, and construction tech while working with founders who've built and scaled construction software
Your Role:
- Design and build our core AI services and APIs using Python (a minimal sketch follows this list)
- Create reliable, scalable backend systems that handle complex data
- Work on our web frontend using ReactJS
- Knowledge of Redux, ReactJS, HTML, and CSS is a must
- Help set up cloud infrastructure and deployment pipelines
- Collaborate with our AI team to integrate machine learning models
- Write clean, tested, production-ready code
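
As a rough illustration of the first bullet under "Your Role" (a Python API in front of an AI service), here is a minimal sketch of a contract-analysis endpoint. FastAPI, the endpoint path, and the keyword check standing in for a real model are assumptions made for illustration, not Palcode.ai's actual stack or design.

# Hypothetical sketch: a small Python API that flags risky language in a
# contract snippet. The keyword list is a stand-in for a real AI model.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="contract-risk-service")

# Placeholder "model": a real service would call an ML model here.
RISK_TERMS = ("liquidated damages", "indemnify", "no damages for delay")


class AnalyzeRequest(BaseModel):
    text: str


class AnalyzeResponse(BaseModel):
    risk_flags: list[str]


@app.post("/analyze", response_model=AnalyzeResponse)
def analyze(req: AnalyzeRequest) -> AnalyzeResponse:
    lowered = req.text.lower()
    flags = [term for term in RISK_TERMS if term in lowered]
    return AnalyzeResponse(risk_flags=flags)

Run it locally with "uvicorn main:app --reload" (assuming the file is named main.py); a production version would replace the keyword list with a model call and add authentication, logging, and tests.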
You'll fit right in if:
- You have 2 years of hands-on Python development experience
- You have 2 years of hands-on ReactJS development experience
- You're comfortable with full-stack development and cloud services
- You write clean, maintainable code and follow good engineering practices
- You're curious about AI/ML and eager to learn new technologies
- You enjoy fast-paced startup environments and take ownership of your work
How we will set you up for success
- You will work closely with the Founding team to understand what we are building.
- You will be given comprehensive training on the tech stack, with the option of virtual training as well.
- You will have a monthly one-on-one with the founders to discuss feedback.
- A unique opportunity to learn from the best: we are Gold partners of the AWS, Razorpay, and Microsoft startup programs, giving us access to a rich network of talent to discuss and brainstorm ideas with.
- You’ll have a lot of creative freedom to execute new ideas. As long as you can convince us, and you’re confident in your skills, we’re here to back you in your execution.
Location: Bangalore
Compensation: Competitive salary + Meaningful equity
If you get excited about solving hard problems that have real-world impact, we should talk.
- All the best!!
Responsibilities:
● Analyze financial data related to loan requests.
● Evaluate loan documents to ensure accuracy and completeness.
● Perform risk assessments on potential loan recipients based on credit rating, borrowing history, and other specific risk factors.
● Record loan denials and the specific basis for declining each application.
● Monitor and report noncompliance with loan covenants.
● Maintain client relationships and provide superior customer service.
● Assist with technical underwriting issues and questions.
● Review loan documentation and vendor reports to identify signs of fraudulent activity.
● Develop and implement underwriting policies and procedures
Company Description
Tayana Solutions is a leading ERP service provider based in Bhopal. We specialize in offering customized ERP solutions and establishing long-term relationships with our clients. As a gold-certified Acumatica partner, we provide top-notch customer support and consulting services. Our team of engineers stays updated with the latest technology trends to develop solutions that meet our clients' needs.
Role Description
This is a full-time, on-site role for a Recruiter. The Recruiter will be responsible for full life-cycle recruiting and hiring, will be involved in various recruitment activities, and will communicate with candidates and hiring managers.
Qualifications
Experience in full life-cycle recruiting, technical and non-technical recruiting, and hiring
Experience required: 2 to 5 years
Budget for the profile: 1 to 3 LPA
Strong communication and interpersonal skills
Ability to build and maintain relationships with candidates and hiring managers
Excellent organizational and time management skills
Experience in the IT industry is a plus
Bachelor's degree in Human Resources, Business Administration, or related field
Office Location: Lalita Nagar, Kolar Road, in front of Amra Villa, Bhopal
Industry
Information Technology & Services
Employment Type
Full-time

We are a technology company at a stage where we are moving ahead fast. We have been enabling enterprises to derive value from data science and technology. At Lumiq, we think every day about how to use the latest technology and concepts to build products that help our customers and enterprises in all aspects of their businesses.
We create data-oriented enterprise applications that must be scalable and robust. This position builds exactly that kind of enterprise application: easy to configure and manage from the end user's perspective. Integration with all the different services and APIs plays an important part in the overall architecture.
Eligibility
Experience and Qualification
- 2-5 years of experience with Drools/Kofax workflows.
- Java microservices: 2-3 years with Spring Boot, Spring Integration, or Spring Batch.
- Working knowledge of Databases
- Design and develop high-volume, low-latency applications for critical systems, delivering high-availability and performance.
- Write well-designed, testable, efficient code and ensure that designs comply with specifications.
- Bachelor's Degree in Computer Science or equivalent
Must Have Skills
- Java
- Drools
- Any Relational Database
- Git
We are looking for candidates who are smart and ambitious. The ideal candidate will have experience in all stages of the sales cycle and should be confident building new client relationships and maintaining existing ones. There will be performance-based incentives for those who are self-driven and motivated to achieve results and grow with us.
Qualifications
- Bachelor's degree (3+ years) in Business, Sales, Marketing, or Technology
- 6 months to 2+ years of experience in sales, business development, or a similar role
- Experience in the full sales cycle, including deal closing
- Demonstrated sales success
- Strong negotiation skills
- Strong written and verbal communication and presentation skills
- Proficient with Excel, PowerPoint, and Google Sheets
- Knowledge of generating leads through LinkedIn, Twitter, and other channels
- Experience in international sales and IT services is preferred
Responsibilities
- Build relationships with prospective clients
- Oversee the sales process to attract new clients
- Work with senior team members to identify and manage risks
- Research and identify new market opportunities
- Attend networking events to attract and retain clients
- Develop and execute sales and marketing strategies to grow the business
- Prepare and deliver pitches to potential investors
- Analyze the market and establish competitive advantages
- Track metrics to ensure targets are hit
- Foster a collaborative environment within the organisation




DevOps Engineer
Skills
- Building a scalable and highly available infrastructure for data science
- Knows data science project workflows
- Hands-on with deployment patterns for online/offline predictions (server/serverless)
- Experience with either Terraform or Kubernetes
- Experience with ML deployment frameworks like Kubeflow, MLflow, SageMaker
- Working knowledge of Jenkins or a similar tool
Responsibilities
- Owns all of the ML cloud infrastructure (AWS)
- Help build out an entire CI/CD ecosystem with auto-scaling
- Work with a testing engineer to design testing methodologies for ML APIs
- Ability to research and implement new technologies
- Help with cost optimization of infrastructure
- Knowledge sharing
Nice to Have
- Develop APIs for machine learning
- Can write Python servers for ML systems with API frameworks (a minimal sketch follows this list)
- Understanding of task queue frameworks like Celery
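
To illustrate the "Python servers for ML systems" and Celery items above, here is a minimal sketch of an offline-prediction worker. The broker URL, result backend, model file, and task name are placeholders assumed for illustration; none of them come from this posting.

# Hypothetical sketch: an offline (batch) prediction worker using Celery.
# Broker URL, result backend, and model path are placeholders.
import joblib
from celery import Celery

app = Celery(
    "ml_worker",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)

# Load the (assumed) pre-trained scikit-learn model once per worker process.
model = joblib.load("model.joblib")


@app.task(name="predict_batch")
def predict_batch(rows: list[list[float]]) -> list[float]:
    # Each row is a flat feature vector; return one score per row.
    return [float(y) for y in model.predict(rows)]


# A client (e.g. an API server) would enqueue work asynchronously:
#   result = predict_batch.delay(feature_rows)
#   scores = result.get(timeout=30)

Start a worker with "celery -A worker_module worker" (assuming the file is named worker_module.py); an online-prediction variant would expose the same model behind an HTTP endpoint instead.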
Nice to have: Python
Must-have frameworks and technologies: Spring Boot, Kafka, MQTT, Docker/Kubernetes, REST APIs
Persistence layer: MongoDB, Elasticsearch, any graph DB (Neo4j/ArangoDB), SQL, HBase
Must have: Exposure to large-scale architecture (concepts of queues, microservices, functional programming)
Must have: Strong data structures and design principles
Expert in developing Node.js applications; strong understanding of NPM and modular application development; proficiency and hands-on experience with Node.js, Express, Sockets, MongoDB/Elasticsearch/Redis/MySQL, and Apache Kafka/Google Pub/Sub; experience working with the MEAN stack is a plus

