50+ SQL Jobs in India
- Work with Azure DevOps to manage CI/CD pipelines, automation, and cloud services.
- Hands-on experience with Microsoft Azure cloud services.
- Develop and maintain Azure DevOps pipelines for continuous integration and deployment.
- Implement automation solutions using Python to streamline processes and workflows.
- Perform minor debugging on Node.js and React Native (can be learned on the job).
- Assist in MLOps operations (beginner-level experience is acceptable or can be learned on the job).
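As a rough illustration of the kind of Python process automation this role describes, a small housekeeping script might archive stale log files. This is a hedged sketch only; the directory layout, the `*.log` pattern, and the 30-day threshold are illustrative assumptions, not details from the posting:

```python
import shutil
import time
from pathlib import Path

def archive_old_logs(log_dir: str, archive_dir: str, max_age_days: int = 30) -> list[str]:
    """Move .log files older than max_age_days into archive_dir.

    Returns the sorted names of the files that were archived.
    """
    src = Path(log_dir)
    dst = Path(archive_dir)
    dst.mkdir(parents=True, exist_ok=True)
    cutoff = time.time() - max_age_days * 86400
    archived = []
    for f in src.glob("*.log"):
        if f.stat().st_mtime < cutoff:  # last modified before the cutoff
            shutil.move(str(f), dst / f.name)
            archived.append(f.name)
    return sorted(archived)
```

A script like this would typically be wired into an Azure DevOps pipeline as a scheduled job step.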
Who are we?
Kriyadocs is a leading document workflow SaaS platform focused on the publishing industry. Technology is at the core of our evolution – we’ve consciously striven to always stay ahead of the curve in its adoption to provide best-in-class capabilities for our clients and our employees. This ethos is reflected in our vision and mission.
Our Vision: To make publishing all content as simple as clicking a button and become the partner of choice for individuals and organizations looking to share knowledge.
Our Mission: Provide a fantastic experience to authors, content publishers and our own employees through technology and innovation, by publishing high-quality content seamlessly and quickly. We deliver Happy Authors and Happy Employees.
What will you be doing?
- Leading and managing our Customer Support team with regular 1-1s and identifying the training needs.
- Serving as the technical escalation point for the team, bringing troubleshooting and product bug-fixing experience.
- Coaching the team to improve and striving to give our customers the best experience.
- Planning and implementing the customer support strategy whilst leading by example, providing first-class customer support.
- Brainstorming and implementing process improvements to increase efficiency in customer service operations.
- Measuring and enhancing the customer experience through data-led decisions and KPI tracking.
- Coordinating with CS/Product/Engineering teams to ensure that escalated customer support issues are resolved quickly and efficiently.
- Leveraging technical support skills, including writing scripts and basic bug fixes, to assist customers effectively.
What are we seeking in you?
- 5+ years of experience in Customer Support for a B2B SaaS platform company.
- You are obsessed with delivering exceptional customer experiences, creating and implementing strategies backed up with metrics.
- You understand operational processes and how to build teams and reporting structures.
- You have a rapid ability to learn & teach others.
- You have strong written and verbal communications.
- You have excellent attention to detail, alongside the ability & willingness to work quickly.
- Technical support experience, including scripting and basic bug fixes, is a must.
- Knowledge of XML, XSLT, SQL, or JavaScript, and/or experience in the publishing domain, would be an added advantage.
This is a work from office opportunity, and you will be working out of our office in Chennai.
What is it really like to work here?
At Kriyadocs, every Kriyator is driven by our culture at the core to
· Deliver Excellence - Deliver Delight
· Stay Curious - Stay Driven
· Dream Big - Rise Together
You could also be a Kriyator, if you are
· Fearless in taking on challenges
· Focused on learning, demonstrating new skills and working towards successful outcomes
· Fanatical in taking pride and responsibility in all your work
Why should you join us?
· Industry Leading Product - We are the leading platform in our space and have several large global brands as our customers.
· Create an impact - We give you the environment to transform your ideas into reality and create fantastic experiences for our customers.
· Budding & Agile team - We are a growing team with a love for learning, a constant quest for quality, and an outspoken sense of ownership.
As Customer Support Lead at Kriyadocs, you will be at the forefront of ensuring our customers receive top-notch support and technical assistance. You will manage and lead our Customer Support team, focusing on both traditional support and technical problem-solving. The candidate should be a self-starter, a good collaborator, and must have a bias for action. They should be comfortable with ambiguity and the challenges within a growing startup. If this excites you, we want to talk to you!
Position name: SAP Technology Specialist L2
Location: Mumbai Andheri East, In Office
Shift: Rotational shift (hybrid: 2 days a week, Wednesday & Thursday)
Role Responsibilities:
- Independently handle complex tasks such as system refreshes, DB upgrades, SAP and DB maintenance tasks, support pack upgrades, and performance analysis and troubleshooting.
- Administer the SAP database (perform database upgrades, apply database maintenance, administer database performance, manage database storage, and handle database problem determination and resolution).
- Provide technical and customer support and monitor system performance.
- Perform SAP client administration (create client, copy client, delete client, export/import client).
- Handle application alerts.
- Apply and manage SAP maintenance (hot packages and kernel upgrades) across all systems using a structured methodology.
- Overall System Monitoring (work processes, users, system logs, short dumps, locks, developer traces, system traces, disk space, etc.).
- Analyze ABAP dumps and system log error messages and provide solutions.
- Implement SAP OSS Notes as requested by the customer, after coordinating with a specialist.
- Apply and maintain certificates on SAP ABAP, Java, Web Dispatcher, etc.
- Maintain documentation and procedures on all provided Services, in a supportive role.
Skills and qualifications:
Must Have Skills:
- 4 to 8 years SAP BASIS administration experience.
- Hold at least a bachelor's degree in an IT-related field.
- Excellent communication skills (English - verbal and written).
- Extensive knowledge of information technology, and a solid understanding of SAP software, configuration, and maintenance.
- Strong Linux and Windows administration hands-on experience.
- Good knowledge of one or more of the following databases (SQL Server, MaxDB, DB2, Oracle, HANA, Sybase ASE) and their management tools.
- SAP Security (Basic knowledge and understanding)
- Any SAP certifications in NW administration.
- Motivated and quick learner.
Nice to have skills:
- A drive to constantly improve, pursue continuing education, and keep up to date with the rapidly evolving SAP industry.
- Ability to participate across various teams, cultures, geographies, and time zones.
- Ability to handle multiple tasks simultaneously.
- Innovative problem solving & analysis.
- Good team player.
Benefits
- Work from Home set-up
- Comprehensive medical benefits
- Gratuity, PF, EPS and Bonus, NPS
- Shift Allowances
- On-call Allowance
- Health and wellness Allowances
- Learning and Development Allowances
- No-questions-asked certification policy.
- Certification Bounty Bonus
Package: 24.6 LPA.
Skills: The candidate should have good experience in SAP Basis, working on at least two of the following databases: HANA, SQL Server, Oracle, DB2.
We will invite candidates for the selection process who meet the following criteria:
- Graduates/postgraduates from the computer stream only (16 years of education: 10+2+4 or 10+3+3), graduating in 2023/24
- 65% minimum marks in all semesters, cleared in one go
- Excellent communication skills - written and verbal in English
Further details on salary, benefits, location and working hours:
- Compensation: Rs. 20,000/- (inclusive of PF) stipend for the first 3 months while on training, and Rs. 4 LPA on successful completion.
- Location: Pune
- Working hours: UK timing (8am – 5pm).
- Health insurance coverage of Rs. 5L will be provided for the duration of your employment.
Selection process
The selection process will consist of an aptitude assessment (1.5 hr), followed by a technical MCQ test on Java and SQL (20 min) and a Java programming assignment. After clearing all of the above rounds, the final round will be a personal interview.
We request that you only reply to this email message if you meet the selection criteria and agree with the terms and conditions. Please also mention which position you are applying for.
We will also ask you to answer the following screening questions if you wish to apply for any of these open vacancies.
Why should Onepoint consider you for an interview?
Which values are important for you at the workplace and why?
Where would you like to be in 2 to 3 years’ time in terms of your career?
at Cognitive Clouds Software Pvt Ltd
- 1-5 years of experience in Java Development.
- Very good with DSA - Java programming.
- Hands-on experience in working with REST APIs & Web Services.
- Good Database design skills - SQL
- Write well-designed, testable, efficient code.
- Contribute to all phases of the development lifecycle.
Work Location : Bangalore
Work mode : Work from office.
Job Description:
We are seeking a skilled Power BI Developer with 4 to 7 years of experience to join our dynamic team in Mumbai. The ideal candidate will have a strong background in data visualization, data modeling, and the ability to work with large datasets to provide actionable insights through Power BI.
Skill: Power BI
Experience: 4 to 7 years
Location: Mumbai (Goregaon)
Notice Period: 0 to 15 days
Key Responsibilities:
- Develop and Maintain Dashboards & Reports:
- Design, build, and deploy Power BI reports and dashboards using data from various sources, ensuring they are intuitive and user-friendly.
- Data Modeling:
- Develop and maintain efficient data models using Power BI, ensuring data accuracy and performance optimization.
- DAX Queries & Calculations:
- Write complex DAX queries to implement calculations and business logic, optimizing the performance of Power BI dashboards and reports.
- Data Integration:
- Connect Power BI to various data sources (SQL Server, Excel, APIs, etc.) and work with ETL processes to transform raw data into usable insights.
- Performance Optimization:
- Optimize report performance through best practices in data modeling, query tuning, and Power BI report development.
- Collaboration with Stakeholders:
- Work closely with business users, stakeholders, and data analysts to gather and understand requirements, translating them into effective Power BI solutions.
Required Skills:
- Strong proficiency in Power BI, including Power Query, Power Pivot, and DAX.
- Experience with data modeling and performance tuning in Power BI.
- Hands-on experience with SQL, and the ability to write complex queries.
- Knowledge of ETL tools and processes.
- Experience working with different types of data sources (SQL, Excel, APIs).
- Familiarity with cloud platforms like Azure or AWS is a plus.
- Strong analytical and problem-solving skills.
- Ability to communicate effectively with both technical and non-technical stakeholders.
- Excellent attention to detail, with a focus on delivering high-quality, accurate reports.
Company: CorpCare
Title: Lead Engineer (Full stack developer)
Location: Mumbai (work from office)
CTC: Commensurate with experience
About Us:
CorpCare is India’s first all-in-one corporate funds and assets management platform. We offer a single-window solution for corporates, family offices, and HNIs. We assist corporates in formulating and managing treasury management policies and conducting reviews with investment committees and the board.
Job Summary:
The Lead Engineer will be responsible for overseeing the development, implementation, and management of our corporate funds and assets management platform. This role demands a deep understanding of the broking industry/Financial services industry, software engineering, and product management. The ideal candidate will have a robust background in engineering leadership, a proven track record of delivering scalable technology solutions, and strong product knowledge.
Key Responsibilities:
- Engineering Strategy and Vision:
- Develop and communicate a clear engineering vision and strategy aligned with our broking and funds management platform.
- Conduct market research and technical analysis to identify trends, opportunities, and customer needs within the broking industry.
- Define and prioritize the engineering roadmap, ensuring alignment with business goals and customer requirements.
- Lead cross-functional engineering teams (software development, QA, DevOps, etc.) to deliver high-quality products on time and within budget.
- Oversee the entire software development lifecycle, from planning and architecture to development and deployment, ensuring robust and scalable solutions.
- Write detailed technical specifications and guide the engineering teams to ensure clarity and successful execution.
- Leverage your understanding of the broking industry to guide product development and engineering efforts.
- Collaborate with product managers to incorporate industry-specific requirements and ensure the platform meets the needs of brokers, traders, and financial institutions.
- Stay updated with regulatory changes, market trends, and technological advancements within the broking sector.
- Mentor and lead a high-performing engineering team, fostering a culture of innovation, collaboration, and continuous improvement.
- Recruit, train, and retain top engineering talent to build a world-class development team.
- Conduct regular performance reviews and provide constructive feedback to team members.
- Define and track key performance indicators (KPIs) for engineering projects to ensure successful delivery and performance.
- Analyze system performance, user data, and platform metrics to identify areas for improvement and optimization.
- Prepare and present engineering performance reports to senior management and stakeholders.
- Work closely with product managers, sales, marketing, and customer support teams to align engineering efforts with overall business objectives.
- Provide technical guidance and support to sales teams to help them understand the platform's capabilities and competitive advantages.
- Engage with customers, partners, and stakeholders to gather feedback, understand their needs, and validate engineering solutions.
Requirements:
- BE/B.Tech in Computer Science from a top engineering college
- MBA a plus, not required
- 5+ years of experience in software engineering, with at least 2+ years in a leadership role.
- Strong understanding of the broking industry and financial services industry.
- Proven track record of successfully managing and delivering complex software products.
- Excellent communication, presentation, and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Experience with Agile/Scrum methodologies.
- Deep understanding of software architecture, cloud computing, and modern development practices.
Technical Expertise:
- Front-End: React, Next.js, JavaScript, HTML5, CSS3
- Back-End: Node.js, Express.js, RESTful APIs
- Database: MySQL, PostgreSQL, MongoDB
- DevOps: Docker, Kubernetes, AWS (EC2, S3, RDS), CI/CD pipelines
- Version Control: Git, GitHub/GitLab
- Other: TypeScript, Webpack, Babel, ESLint, Redux
Preferred Qualifications:
- Experience in the broking or financial services industry.
- Familiarity with data analytics tools and methodologies.
- Knowledge of user experience (UX) design principles.
- Experience with trading platforms or financial technology products.
This role is ideal for someone who combines strong technical expertise with a deep understanding of the broking industry and a passion for delivering high-impact software solutions.
You will be working hands-on on a complex and compound product that has the potential to be used by millions of sales and marketing people around the world. You will contribute to delivering an excellent product platform that:
- enables quick iteration
- supports product customization
- and handles scale
What do we expect you to have?
- 2+ years of experience in backend engineering
- An intent to learn and an urge to build a product by learning different technologies
- Interest in writing complex, scalable, and maintainable backend applications
- Tech stack requirements:
Must haves
- Experience building application servers in Java (Spring / Spring Boot), NodeJS, Golang, or Python
- Experience in using SQL databases and designing schemas based on application need
- Experience with container services and runtimes (docker / docker-compose / k8s)
- Experience with cloud PaaS (AWS / GCP / Azure)
- Experience and familiarity with microservices concepts
- Experience with bash scripting
Good to have (Preferred)
- Preferred experience with org-wide message queues (RabbitMQ / AWS SQS)
- Preferred experience with task orchestration services (Apache Airflow / AWS Step Functions)
- Preferred experience with infrastructure-as-code (or system configuration) tools (Terraform / Chef / Ansible)
- Preferred experience with essential build tools (make / Makefile)
- Preferred experience with monitoring and tracing systems for performance / system / application monitoring (Grafana + Loki + Prometheus / AWS CloudWatch)
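To illustrate the "designing schemas based on application need" requirement above, here is a minimal, hypothetical sketch using Python's built-in sqlite3 module. The tables, columns, and the index chosen for the "orders for a given user" access pattern are invented for the example, not taken from the posting:

```python
import sqlite3

# Illustrative schema for a simple orders workload (all names hypothetical).
SCHEMA = """
CREATE TABLE users (
    id    INTEGER PRIMARY KEY,
    email TEXT NOT NULL UNIQUE
);
CREATE TABLE orders (
    id      INTEGER PRIMARY KEY,
    user_id INTEGER NOT NULL REFERENCES users(id),
    total   REAL NOT NULL CHECK (total >= 0)
);
-- Index chosen for the access pattern "all orders for a given user".
CREATE INDEX idx_orders_user ON orders(user_id);
"""

def order_total_for_user(conn: sqlite3.Connection, email: str) -> float:
    """Sum of order totals for one user; 0 if they have no orders."""
    row = conn.execute(
        "SELECT COALESCE(SUM(o.total), 0) FROM orders o "
        "JOIN users u ON u.id = o.user_id WHERE u.email = ?",
        (email,),
    ).fetchone()
    return row[0]

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
conn.execute("INSERT INTO users (id, email) VALUES (1, 'a@example.com')")
conn.executemany("INSERT INTO orders (user_id, total) VALUES (?, ?)",
                 [(1, 10.0), (1, 5.5)])
```

The point of the sketch is the design step: the index follows from the query the application actually runs, not the other way around.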
What will you learn?
- Building a highly available, performant platform of microservices that acts as an API layer
- Industry-standard state-of-the-art tools + methodologies + frameworks + infra for building a product.
- Fable is not a trivial CRUD app. It requires a lot of consideration and care for building the API layer as the product is highly customizable per user.
- How different functions (sales, marketing, product, engineering) in a high-velocity product company work in synergy to deliver an iterative product in real life.
Who would you be working with?
- You would be directly working with the co-founder & CTO who has built multiple companies before and has built large teams in large-scale companies like ThoughtSpot, Unacademy, etc.
Position details
- Fully remote.
- 5 days/week (all public and government holidays will be non-working days).
- No specific work hours (we will sync over Zoom during the day).
Job Description: AI/ML Engineer
Location: Bangalore (On-site)
Experience: 2+ years of relevant experience
About the Role:
We are seeking a skilled and passionate AI/ML Engineer to join our team in Bangalore. The ideal candidate will have over two years of experience in developing, deploying, and maintaining AI and machine learning models. As an AI/ML Engineer, you will work closely with our data science team to build innovative solutions and deploy them in a production environment.
Key Responsibilities:
- Develop, implement, and optimize machine learning models.
- Perform data manipulation, exploration, and analysis to derive actionable insights.
- Use advanced computer vision techniques, including YOLO and other state-of-the-art methods, for image processing and analysis.
- Collaborate with software developers and data scientists to integrate AI/ML solutions into the company's applications and products.
- Design, test, and deploy scalable machine learning solutions using TensorFlow, OpenCV, and other related technologies.
- Ensure the efficient storage and retrieval of data using SQL and data manipulation libraries such as pandas and NumPy.
- Contribute to the development of backend services using Flask or Django for deploying AI models.
- Manage code using Git and containerize applications using Docker when necessary.
- Stay updated with the latest advancements in AI/ML and integrate them into existing projects.
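As a hedged sketch of the "storage and retrieval of data using SQL and pandas" responsibility above, the following moves a small detection table into SQLite and pulls back only high-confidence rows. The table name, columns, and the 0.5 threshold are assumptions for illustration:

```python
import sqlite3

import pandas as pd

# Hypothetical model-output table; names and values are made up.
df = pd.DataFrame({"image_id": [1, 2, 3], "score": [0.91, 0.42, 0.77]})

conn = sqlite3.connect(":memory:")
df.to_sql("detections", conn, index=False)  # persist the DataFrame as a table

# Push the filtering and sorting down into SQL, then read the result
# back into a DataFrame for further analysis.
high = pd.read_sql_query(
    "SELECT image_id, score FROM detections "
    "WHERE score >= 0.5 ORDER BY score DESC",
    conn,
)
```

For large result sets, filtering in SQL before materializing a DataFrame keeps memory use proportional to the answer, not the whole table.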
Required Skills:
- Proficiency in Python and its associated libraries (NumPy, pandas).
- Hands-on experience with TensorFlow for building and training machine learning models.
- Strong knowledge of linear algebra and data augmentation techniques.
- Experience with computer vision libraries like OpenCV and frameworks like YOLO.
- Proficiency in SQL for database management and data extraction.
- Experience with Flask for backend development.
- Familiarity with version control using Git.
Optional Skills:
- Experience with PyTorch, Scikit-learn, and Docker.
- Familiarity with Django for web development.
- Knowledge of GPU programming using CuPy and CUDA.
- Understanding of parallel processing techniques.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Demonstrated experience in AI/ML, with a portfolio of past projects.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork skills.
Why Join Us?
- Opportunity to work on cutting-edge AI/ML projects.
- Collaborative and dynamic work environment.
- Competitive salary and benefits.
- Professional growth and development opportunities.
If you're excited about using AI/ML to solve real-world problems and have a strong technical background, we'd love to hear from you!
Apply now to join our growing team and make a significant impact!
Thirumoolar IT Solutions is looking for a motivated and enthusiastic Fresher Trained Dataset Engineer to join our team. This entry-level position is ideal for recent graduates who are eager to apply their academic knowledge in a practical setting and contribute to the development of high-quality datasets for machine learning applications.
Responsibilities
Assist in the collection, cleaning, and preprocessing of data to ensure it is ready for training machine learning models.
Collaborate with senior dataset engineers and data scientists to understand the requirements for specific machine learning tasks.
Participate in the annotation and labeling of datasets, ensuring accuracy and consistency in data representation.
Conduct quality checks on datasets to identify and rectify errors or inconsistencies.
Support the development of documentation and guidelines for data annotation processes.
Stay updated with the latest tools and techniques in data processing and machine learning.
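A minimal sketch of the cleaning and preprocessing work described above, using pandas. The "text"/"label" columns and the normalization rules (drop missing labels, lowercase, dedupe) are illustrative assumptions, not a prescribed workflow:

```python
import pandas as pd

def clean_labels(df: pd.DataFrame) -> pd.DataFrame:
    """Basic cleaning pass for a labeled dataset: drop rows with missing
    labels, normalize label casing/whitespace, and remove exact duplicates.
    """
    out = df.dropna(subset=["label"]).copy()
    out["label"] = out["label"].str.strip().str.lower()
    out = out.drop_duplicates(subset=["text", "label"])
    return out.reset_index(drop=True)

raw = pd.DataFrame({
    "text":  ["cat photo", "dog photo", "cat photo", "blurry"],
    "label": ["  Cat",     "dog",       "cat",       None],
})
clean = clean_labels(raw)
```

Note that normalizing before deduplicating matters: "  Cat" and "cat" only collapse into one row once casing and whitespace are standardized.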
Skills and Qualifications
Bachelor’s degree in Computer Science, Data Science, Mathematics, or a related field.
Basic understanding of machine learning concepts and the importance of high-quality datasets.
Familiarity with programming languages such as Python or R is a plus.
Knowledge of data manipulation libraries (e.g., Pandas, NumPy) is advantageous.
Strong analytical skills and attention to detail.
Excellent communication and teamwork abilities.
A passion for learning and a desire to grow in the field of data engineering.
Preferred Location
Candidates based in Tamil Nadu or those willing to work from home are encouraged to apply.
The Sr. Analytics Engineer provides technical expertise in needs identification, data modeling, data movement, and source-to-target transformation mapping, as well as automation and testing strategies, translating business needs into technical solutions that adhere to established data guidelines and approaches, from a business-unit or project perspective.
Understands and leverages best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges.
Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.
Actively participates with other consultants in problem-solving and approach development.
Responsibilities :
Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs.
Perform data analysis to validate data models and to confirm the ability to meet business needs.
Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.
Coordinate and consult with the Data Architect, project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels.
Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.
Coordinate with Data Architects, Program Managers and participate in recurring meetings.
Help and mentor team members to understand the data model and subject areas.
Ensure that the team adheres to best practices and guidelines.
Requirements :
- At least 3 years of strong working knowledge of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.
- Experience with Spark optimization/tuning/resource allocations
- Excellent understanding of in-memory distributed computing frameworks like Spark, including parameter tuning and writing optimized workflow sequences.
- Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL or analytical databases (e.g., Redshift, BigQuery, Cassandra).
- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob storage, AWS S3, Google Cloud storage, etc.
- Have a deep understanding of the various stacks and components of the Big Data ecosystem.
- Hands-on experience with Python is a huge plus
Job Title: Inventory & Product Analytics Specialist
Location: Indore (M.P.)
Job Type: Full-Time
Job Summary
We are seeking a highly analytical and detail-oriented Inventory & Product Analytics Specialist to join our team. The successful candidate will be responsible for analyzing inventory data, monitoring product performance, and providing actionable insights to optimize inventory levels, improve product availability, and drive overall business performance. This role requires strong analytical skills, an understanding of inventory management, and the ability to work collaboratively across departments.
Key Responsibilities
- Inventory Management & Analysis:
- Monitor and analyze inventory levels to ensure optimal stock levels across various product categories.
- Identify slow-moving, overstocked, and out-of-stock items, and provide actionable recommendations to manage inventory flow.
- Develop and maintain inventory forecasting models using historical data, trends, and market demands.
- Conduct regular audits to ensure data accuracy in the inventory management system.
- Provide reports on key inventory metrics such as stock turnover, days of inventory, and reorder points.
- Product Analytics:
- Analyze product performance, including sales trends, profit margins, and product lifecycles, to drive decision-making.
- Track and report on key performance indicators (KPIs) related to product sales, including top-performing and underperforming items.
- Collaborate with product development, marketing, and procurement teams to evaluate product trends and suggest strategies to enhance product offerings.
- Assist in setting pricing strategies based on data analysis of product performance and market conditions.
- Data Reporting & Visualization:
- Create and maintain dashboards and reports to track inventory levels, product performance, and overall supply chain health.
- Present insights and recommendations to key stakeholders, including management, operations, and marketing teams.
- Provide ad-hoc data analysis as required by various departments.
- Process Improvement:
- Identify inefficiencies in inventory management processes and work with cross-functional teams to implement improvements.
- Suggest automation and technology solutions for inventory tracking and reporting.
- Stay updated on industry trends, inventory management tools, and analytics best practices.
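The inventory metrics mentioned above (stock turnover, days of inventory, reorder points) follow standard textbook formulas. A minimal Python sketch, with every figure in the example purely illustrative:

```python
def stock_turnover(cogs: float, avg_inventory_value: float) -> float:
    """Annual cost of goods sold divided by average inventory value."""
    return cogs / avg_inventory_value

def days_of_inventory(turnover: float, period_days: int = 365) -> float:
    """How many days the average inventory lasts at the current turnover."""
    return period_days / turnover

def reorder_point(daily_demand: float, lead_time_days: float,
                  safety_stock: float = 0.0) -> float:
    """Stock level at which to reorder: expected demand over the supplier
    lead time, plus a safety buffer."""
    return daily_demand * lead_time_days + safety_stock
```

For example, an annual COGS of 500,000 against an average inventory value of 100,000 gives a turnover of 5, i.e. about 73 days of inventory on hand.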
Qualifications
- Education:
- Bachelor’s degree in Business, Supply Chain Management, Data Analytics, or a related field.
- Master’s degree is a plus.
- Experience:
- 2-5 years of experience in inventory management, product analytics, data analysis, or a similar role.
- Experience in retail, e-commerce, or manufacturing industries preferred.
- Technical Skills:
- Proficiency in Excel, SQL, and other data analysis tools (e.g., Python, R) is a plus.
- Experience with inventory management systems (e.g., SAP, Oracle, NetSuite) and product analytics platforms.
- Knowledge of data visualization tools (e.g., Power BI, Tableau, Looker).
- Key Competencies:
- Strong analytical and problem-solving skills.
- Excellent attention to detail and ability to work with large datasets.
- Effective communication skills, with the ability to present complex data in a clear and concise manner.
- Ability to work cross-functionally and collaboratively with various teams.
Benefits:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
at Wissen Technology
Experience:
- A minimum of 4-10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements.
- Experience with Microsoft Azure cloud and Snowflake SQL, including database query/performance tuning.
- Experience with Qlik Replicate and Qlik Compose (Change Data Capture) tools is considered a plus.
- Strong data warehousing concepts; experience with ETL tools such as Talend Cloud Data Integration is a must.
- Exposure to the financial domain knowledge is considered a plus.
- Experience with cloud managed services such as GitHub source control and MS Azure DevOps is considered a plus.
- Prior experience with State Street and Charles River Development ( CRD) considered a plus.
- Experience in tools such as Visio, PowerPoint, Excel.
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus.
- Strong SQL knowledge and debugging skills are a must.
Responsibilities:
- As a Data Integration Developer/Sr. Developer, work hands-on on ETL/ELT data pipelines, the Snowflake DWH, CI/CD deployment pipelines, and data-readiness (data quality) design, development, and implementation, and address code or data issues.
- Experience in designing and implementing modern data pipelines for a variety of data sets, including internal/external data sources, complex relationships, various data formats, and high volumes.
- Experience with and understanding of ETL job performance techniques, exception handling, and query performance tuning/optimization, with data loads meeting runtime/schedule SLAs for both batch and real-time use cases.
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions.
- Demonstrated strong collaborative experience across regions (APAC, EMEA, and NA) to develop design standards, high-level design solution documents, cross-training, and resource onboarding activities.
- Good understanding of the SDLC process, governance clearance, peer code reviews, unit test results, code deployments, code security scanning, and Confluence/Jira Kanban stories.
- Strong attention to detail during root cause analysis, SQL query debugging and defect issue resolution by working with multiple business/IT stakeholders.
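As a simplified, hedged sketch of the exception-handling pattern these responsibilities describe (quarantine bad records for root-cause analysis rather than failing the whole load), in plain Python with invented record fields:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def extract(rows):
    # In a real pipeline this would read from a source system;
    # here it simply yields the rows it is given.
    yield from rows

def transform(row):
    # Raises KeyError/ValueError on bad data so the caller can
    # quarantine the record. Field names are hypothetical.
    return {"id": int(row["id"]), "amount": round(float(row["amount"]), 2)}

def run_pipeline(rows):
    """Load the good records; quarantine bad ones instead of failing the job."""
    loaded, rejected = [], []
    for row in extract(rows):
        try:
            loaded.append(transform(row))
        except (KeyError, ValueError) as exc:
            log.warning("rejecting row %r: %s", row, exc)
            rejected.append(row)
    return loaded, rejected
```

The rejected list would typically land in a quarantine table so data quality issues can be triaged with business stakeholders without blocking the SLA for the rest of the load.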
About Wissen Technology:
• The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015.
• Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world class products.
• Our workforce consists of 550+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
• Wissen Technology has grown its revenues by 400% in these five years without any external funding or investments.
• Globally present, with offices in the US, India, UK, Australia, Mexico, and Canada.
• We offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
• Wissen Technology has been certified as a Great Place to Work®.
• Wissen Technology has been voted as the Top 20 AI/ML vendor by CIO Insider in 2020.
• Over the years, Wissen Group has successfully delivered $650 million worth of projects for more than 20 of the Fortune 500 companies.
We have served clients across sectors such as Banking, Telecom, Healthcare, Manufacturing, and Energy, including the likes of Morgan Stanley, MSCI, State Street, Flipkart, Swiggy, Trafigura, and GE, to name a few.
Website : www.wissen.com
The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients.
Real-time marketing automation built on an intelligent & secure Customer Data Platform increases conversions, retention & growth for enterprises.
Responsibilities:
- Design and Develop large scale sub-systems
- Periodically explore the latest technologies (especially open source) and prototype sub-systems
- Be a part of the team that develops the next-gen Targeting platform
- Build components to make the customer data platform more efficient and scalable
Qualifications:
- 0-2 years of relevant experience with Java, Algorithms, Data Structures, & Optimizations in addition to Coding.
- Education: B.E/B-Tech/M-Tech/M.S in Computer Science or IT from premier institutes
Skill Set:
- Good aptitude/analytical skills (emphasis will be on algorithms, data structures, and optimizations in addition to coding)
- Good knowledge of Databases - SQL, NoSQL
- Knowledge of Unit Testing a plus
Soft Skills:
- Has an appreciation of technology and its ability to create value in the marketing domain
- Excellent written and verbal communication skills
- Active & contributing team member
- Strong work ethic with demonstrated ability to meet and exceed commitments
- Others: Experience of having worked in a start-up is a plus
at REConnect Energy
Work at the Intersection of Energy, Weather & Climate Sciences and Artificial Intelligence
About the company:
REConnect Energy is India's largest tech-enabled service provider in predictive analytics and demand-supply aggregation for the energy sector. We focus on digital intelligence for climate resilience, offering solutions for efficient asset and grid management, minimizing climate-induced risks, and providing real-time visibility of assets and resources.
Responsibilities:
- Design, develop, and maintain data engineering pipelines using Python.
- Implement and optimize database solutions with SQL and NoSQL databases (MySQL and MongoDB).
- Perform data analysis, profiling, and quality assurance to ensure high service quality standards.
- Troubleshoot and resolve data-pipeline related issues, ensuring optimal performance and reliability.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
- Participate in code reviews and contribute to the continuous improvement of the codebase.
- Utilize GitHub for version control and collaboration.
- Implement and manage containerization solutions using Docker.
- Implement tech solutions to new product development, ensuring scalability, performance, and security.
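As a loose illustration of the pipeline responsibilities above, here is a toy transform-and-validate stage in plain Python. The record shape, field names, and quality metric are invented for the example; a production pipeline would read from MySQL/MongoDB and log its rejects.

```python
from statistics import mean

# Hypothetical raw sensor readings, as a pipeline stage might receive them.
raw = [
    {"site": "A", "kw": "101.5"},
    {"site": "A", "kw": "bad"},   # malformed record
    {"site": "B", "kw": "98.0"},
]

def transform(records):
    """Parse and validate records, separating clean rows from rejects."""
    clean, rejects = [], []
    for r in records:
        try:
            clean.append({"site": r["site"], "kw": float(r["kw"])})
        except (KeyError, ValueError):
            rejects.append(r)
    return clean, rejects

clean, rejects = transform(raw)
quality = len(clean) / len(raw)          # simple data-quality ratio
avg_kw = mean(r["kw"] for r in clean)
print(quality, avg_kw)
```

Tracking the quality ratio per run is one simple way to surface pipeline degradation before it affects downstream consumers.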
Requirements:
- Bachelor's or Master's degree in Computer Science, Software Engineering, Electrical Engineering, or equivalent.
- Proficiency in Python programming, with data engineering expertise.
- Experience with databases, including MySQL and NoSQL stores.
- Experience in developing and maintaining critical and high availability systems will be given strong preference.
- Experience working with AWS cloud platform.
- Strong analytical and data-driven approach to problem solving.
Node.js Developer / NestJS Developer – Job Description
A Bachelor's or Master's degree in Computer Science is preferred, along with excellent problem-solving skills.
Job Type: Full-time
Job Location: Bengaluru (on site)
Preferred Skills: TypeScript/Node.js, SQL/MySQL
Experience: Minimum 2 years in a similar role.
Responsibilities:
- Develop and Maintain Server-side Logic: Design, implement, and maintain the server-side logic using Node.js, ensuring high performance and responsiveness to requests from the front-end.
- API Development: Build and maintain RESTful APIs for seamless integration with front-end services and third-party applications.
- Database Management: Work with databases (such as MongoDB, MySQL, PostgreSQL) to ensure data consistency, reliability, and optimal performance.
- Code Quality and Testing: Write clean, maintainable, and efficient code. Implement automated testing platforms and unit tests.
- Collaboration: Work closely with front-end developers, designers, and other team members to define and implement technical solutions that meet business requirements.
- Troubleshooting and Debugging: Identify issues, debug, and resolve bugs and other technical problems in a timely manner.
- Documentation: Create and maintain documentation related to application development, API usage, and system operations.
- Stay Updated: Keep up-to-date with the latest industry trends and technologies to ensure the application remains modern and competitive.
Role Description
This is a full-time remote role for an SFMC Developer at Cloudsheer. The SFMC Developer will be responsible for implementing projects, optimizing performance, and ensuring reliability in digital solutions. Day-to-day tasks include coding, testing, and maintaining Salesforce Marketing Cloud platforms.
Qualifications
- Proficiency in Salesforce Marketing Cloud (SFMC) development
- Experience in coding, testing, and maintaining SFMC platforms
- Knowledge of SQL, AMPscript, and HTML for SFMC implementation
- Understanding of digital marketing strategies and automation
- Strong problem-solving and analytical skills
- Excellent communication and collaboration abilities
- Ability to work independently and in a remote team environment
- Bachelor's degree in Computer Science, Information Technology, or related field
Required Skill Set:
- Data Model & Mapping
- MS SQL Database
- Analytics SQL Query
- Genesys Cloud Reporting & Analytics API
- Snowflake (good to have)
- Cloud Exposure – AWS or Azure
Technical Experience:
- 5-8 years of experience, preferably at a technology or financial firm
- Strong understanding of data analysis & reporting tools
- Experience with data mining & machine learning techniques
- Excellent communication & presentation skills
- Must have at least 2-3 years of experience in data model/analysis/mapping
- Must have hands-on experience in database tools & technologies
- Must have exposure to Genesys Cloud, WFM, GIM, and the Genesys Analytics API
- Good to have experience with or exposure to Salesforce, AWS or Azure, and Genesys Cloud
- Ability to work independently & as part of a team
- Strong attention to detail and accuracy
Work Scope:
- Data model similar to the GIM database, based on Genesys Cloud data
- API-to-column data mapping
- Data model for business analytics
- Database artifacts
- Scripting – Python
- Autosys, TWS job setup
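The API-to-column mapping item could, for instance, be driven by a declarative mapping table. A small Python sketch follows; all field and column names here are hypothetical, not actual Genesys Cloud API fields.

```python
# Declarative mapping from (hypothetical) API response fields to DB columns.
FIELD_TO_COLUMN = {
    "conversationId": "conversation_id",
    "startTime": "start_time",
    "queueName": "queue_name",
}

def map_record(api_record, mapping=FIELD_TO_COLUMN):
    """Return a row dict keyed by DB column names; unmapped fields are dropped."""
    return {col: api_record[field]
            for field, col in mapping.items() if field in api_record}

row = map_record({
    "conversationId": "c-1",
    "startTime": "2024-01-01T00:00:00Z",
    "queueName": "support",
    "extra": "ignored",
})
print(row)
```

Keeping the mapping as data (rather than hard-coded assignments) makes it easy to review with stakeholders and to extend as the API evolves.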
Alternative Path is looking for an application developer to assist one of its clients, a SaaS platform that helps alternative investment firms streamline their document collection and data extraction processes using machine learning. You will work with individuals in various departments of the company to define and craft new products and features for our platform, and to improve existing ones. You will have a large degree of independence and trust, but you won't be isolated: the support of the Engineering team leads, the Product team leads, and every other technology team member is behind you.
You will bring your projects from initial conception through all the cycles of development from project definition to development, debugging, initial release and subsequent iteration. You will also take part in shaping the architecture of the product, including our deployment infrastructure, to fit the growing needs of the platform.
Key Responsibilities
- This is a backend-heavy Fullstack role.
- Develop front and back-end-related product features for optimal user experience
- Design intuitive user interactions on web pages
- Spin up servers and databases while ensuring stability and scalability of applications
- Work alongside graphic designers to enhance web design features
- Oversee and drive projects from conception to finished product
- Design and develop APIs
- Brainstorm, execute and deliver solutions that meet both technical and consumer needs
- Stay abreast of developments in web applications and programming languages
Desired Skills
- 2-4 years of web application development experience
- Python development and architecture
- Prior experience working with the Django or Flask frameworks
- Knowledge of HTML, CSS, JavaScript and React.
- Familiar with agile development environment, continuous integration and continuous deployment
- Familiar with OOP, MVC, and commonly used design patterns
- Knowledge of SQL and relational databases
- Experience working with one or more AWS services (e.g., EC2, S3, Managed Redis, Elasticsearch, Managed Airflow, RDS) is preferred but not mandatory
- Comfortable with continuous integration, automated testing, source control, and other DevOps methodologies
We are seeking a talented and experienced ServiceNow Developer to join our dynamic team. The ideal candidate will have a strong background in developing and customizing ServiceNow applications, with a deep understanding of the platform's modules and capabilities. This role offers an exciting opportunity to work on diverse projects and collaborate with leading industry professionals.
Key Responsibilities:
- Develop and customize ServiceNow applications and modules based on business requirements.
- Create and manage ServiceNow business rules, script includes, UI actions, and other scripting elements.
- Design and implement ServiceNow integrations using REST and SOAP web services.
- Configure and customize ServiceNow forms, lists, reports, and dashboards.
- Utilize ServiceNow Flow Designer and Orchestration to automate workflows.
- Collaborate with cross-functional teams to gather requirements and deliver effective solutions.
- Troubleshoot and resolve technical issues related to ServiceNow.
- Maintain system documentation and user guides.
Requirements:
- Proficiency in ServiceNow modules, including ITSM, ITOM, ITBM, and ITAM.
- Strong knowledge of JavaScript for client-side and server-side scripting.
- Experience with ServiceNow Studio, Flow Designer, and IntegrationHub.
- Familiarity with REST and SOAP web services for system integrations.
- Understanding of relational databases and SQL.
- ServiceNow Certified System Administrator and/or ServiceNow Certified Application Developer (preferred).
- Previous experience in a ServiceNow Developer role with a track record of successful projects.
- Strong problem-solving skills and the ability to communicate technical concepts to non-technical stakeholders.
- Experience with Agile methodologies and project management principles.
Preferred Qualifications:
- Experience with other ITSM tools and platforms.
- Additional ServiceNow certifications (e.g., Certified Implementation Specialist) are a plus.
- A degree in Computer Science, Information Technology, or a related field.
About us
Fisdom is one of the largest wealthtech platforms that allows investors to manage their wealth in an intuitive and seamless manner. Fisdom has a suite of products and services that takes care of every wealth requirement that an individual would have. This includes Mutual Funds, Stock Broking, Private Wealth, Tax Filing, and Pension Funds.
Fisdom has a B2C app and also an award-winning B2B2C distribution model where we have partnered with 15 of the largest banks in India such as Indian Bank and UCO Bank to provide wealth products to their customers. In our bank-led distribution model, our SDKs are integrated seamlessly into the bank’s mobile banking and internet banking application. Fisdom is the first wealthtech company in the country to launch a stock broking product for customers of a PSU bank.
The company is breaking down barriers by enabling access to wealth management for underserved customers. Our partners have a combined user base of more than 50 crore customers. This makes us uniquely placed to disrupt the wealthtech space, which we believe is in its infancy in India in terms of wider adoption.
Where are we now and where are we heading towards
Founded by veteran VC-turned-entrepreneur Subramanya SV (Subu) and former investment banker Anand Dalmia, Fisdom is backed by PayU (Naspers), Quona Capital, and Saama Capital, with $37 million of total funds raised so far. Fisdom is known for its revenue- and profitability-focused approach towards sustainable business.
Fisdom is the No.1 company in India in the B2B2C wealthtech space and one of the most admired companies in the fintech ecosystem for our business model. We look forward to growing our leadership position by staying focused on product and technology innovation.
Our technology team
Today we are a 60-member strong technology team. Everyone in the team is a hands-on engineer, including the team leads and managers. We take pride in being product engineers and we believe engineers are fundamentally problem solvers first. Our culture binds us together as one cohesive unit. We stress on engineering excellence and strive to become a high talent density team. Some values that we preach and practice include:
- Individual ownership and collective responsibility
- Focus on continuous learning and constant improvement in every aspect of engineering and product
- Cheer for openness, inclusivity and transparency
- Merit-based growth
What we are looking for
- Openness to working in a flat, non-hierarchical setup where the daily focus is shipping features, not reporting to managers
- Experience designing highly interactive web applications with performance, scalability, accessibility, usability, design, and security in mind.
- Experience with distributed (multi-tiered) systems, algorithms, and relational and no-sql databases.
- Ability to break-down larger/fuzzier problems into smaller ones in the scope of the product
- Experience with architectural trade-offs, applying synchronous and asynchronous design patterns, and delivering with speed while maintaining quality.
- Raise the bar on sustainable engineering by improving best practices and producing best-in-class code, documentation, testing and monitoring.
- Contributes in code and actively takes part in code reviews.
- Working with the Product Owner/managers to clearly define the scope of multiple sprints. Lead/guide the team through sprint(s) scoping, resource allocation and commitment - the execution plan.
- Drives feature development end-to-end. Active partner with product, design, and peer engineering leads and managers.
- Familiarity with build, release, deployment tools such as Ant, Maven, and Gradle, Docker, Kubernetes, Jenkins etc.
- Effective at influencing a culture of engineering craftsmanship and excellence
- Helps the team make the right choices. Drives adoption of engineering best practices and development processes within their team.
- Understanding security and compliance.
- User authentication and authorisation between multiple systems, servers, and environments.
- Based on your experience, you may lead a small team of Engineers.
If you don't have all of these, that's ok. But be excited about learning the few you don't know.
Skills
Microservices, Engineering Management, Quality Management, Technical Architecture, Technical Lead. Hands-on programming experience in one of the following languages: Python, Golang.
Additional perks
- Access to large repositories of online courses through Myacademy (includes Udemy, Coursera, Harvard ManageMentor, Udacity and many more). We strongly encourage learning something outside of work as a habit.
- Career planning support/counseling / coaching support. Both internal and external coaches.
- Relocation policy
You will not be a good fit for this role if
- you have experience of only working with services companies or have spent a major part of your time there
- you are not open to shifting to new programming language or stack but exploring a position aligned to your current technical experience
- you are not very hands-on, seek direction constantly and need continuous supervision from a manager to finish tasks
- you like working alone, and mentoring junior engineers does not interest you
- you are looking to work in very large teams
Why join us and where?
We're a small but high-performing engineering team. We recognize that the work we do impacts the lives of hundreds of thousands of people. Your work will contribute significantly to our mission. We pay competitive compensation and performance bonuses. We provide a high-energy work environment, and you are encouraged to experiment with new technology and keep learning. You will be based out of Bangalore.
Key Responsibilities
- Write code, build prototypes and resolve issues.
- Write and review unit test cases.
- Review code & designs for both oneself and team members
- Defining and building microservices
- Building systems with positive business outcome
- Track module health, usage & behaviour.
Key Skills
- An engineer with 1-3 years of working experience in web services, preferably in Python
- Must have a penchant for good API design.
- Must be a stickler for good, clear and secure coding.
- Must have built and released APIs in production.
- Experience in working with RDBMS & NoSQL databases.
- Working knowledge of GCP, AWS, Azure or any other cloud provider.
- Aggressive problem diagnosis & creative problem solving skills.
- Communication skills, to speak to developers across the world
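On the "penchant for good API design" point, one widely used convention is returning a consistent response envelope with pagination metadata. A minimal Python sketch follows; the field names are illustrative, not a fixed standard.

```python
def paginate(items, page=1, per_page=2):
    """Build a paginated API response envelope: data plus pagination metadata."""
    if page < 1 or per_page < 1:
        raise ValueError("page and per_page must be positive")
    start = (page - 1) * per_page
    total = len(items)
    return {
        "data": items[start:start + per_page],
        "meta": {
            "page": page,
            "per_page": per_page,
            "total": total,
            "has_next": start + per_page < total,
        },
    }

resp = paginate(["a", "b", "c"], page=2)
print(resp)
```

Validating inputs and exposing `has_next` keeps clients from guessing at bounds, which is the kind of clear, secure API surface the role calls for.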
About the Role
We are actively seeking talented Senior Python Developers to join our ambitious team dedicated to pushing the frontiers of AI technology. This opportunity is tailored for professionals who thrive on developing innovative solutions and who aspire to be at the forefront of AI advancements. You will work with different companies in the US who are looking to develop both commercial and research AI solutions.
Required Skills:
- Write effective Python code to tackle complex issues
- Use business sense and analytical abilities to glean valuable insights from public databases
- Clearly express the reasoning and logic when writing code in Jupyter notebooks or other suitable mediums
- Extensive experience working with Python
- Proficiency with the language's syntax and conventions
- Previous experience tackling algorithmic problems
- Nice to have: some prior Software Quality Assurance and Test Planning experience
- Excellent spoken and written English communication skills
The ideal candidate should be able to:
- Clearly explain their strategies for problem-solving.
- Design practical solutions in code.
- Develop test cases to validate their solutions.
- Debug and refine their solutions for improvement.
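Those four abilities can be demonstrated on a small, self-contained example. Here the classic "two sum" problem is used, chosen arbitrarily for illustration:

```python
def two_sum(nums, target):
    """Return indices of two numbers adding to target, else None.
    A single pass with a hash map gives O(n) time instead of the
    naive O(n^2) pair scan."""
    seen = {}                      # value -> index
    for i, x in enumerate(nums):
        if target - x in seen:
            return seen[target - x], i
        seen[x] = i
    return None

# Test cases validating the solution, including edge cases.
assert two_sum([2, 7, 11, 15], 9) == (0, 1)
assert two_sum([3, 3], 6) == (0, 1)   # duplicates handled
assert two_sum([1, 2], 5) is None     # no valid pair
```

The docstring explains the strategy, the function is the practical solution, and the assertions are the test cases used to validate and debug it.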
5 years in software development (minimum 3 years)
Strong expertise in Ruby on Rails (3-5 years)
Knowledge of Python is a plus
Key Skills:
Proficiency in scalable app techniques: caching, APM, microservices architecture
Ability to write high-quality code independently
Experience mentoring junior engineers (those with 0-2 years of experience)
What We Offer:
An opportunity to work with a dynamic team
A challenging environment where your skills will be put to the test
A chance to make a real impact by guiding and mentoring others
Ready to make your mark? If you're based out of or willing to relocate to Gurgaon and have the experience we're looking for, apply now!
Roles and Responsibilities:
- Contribute in all phases of the development lifecycle
- Write well designed, testable, efficient code
- Ensure designs comply with specifications
- Prepare and produce releases of software components
- Support continuous improvement by investigating alternatives and technologies and presenting these for architectural review
- Ensure continual knowledge management
- Adherence to the organizational guidelines and processes
Skills/Competencies:
a. Bachelor's/Master's degree with good experience in computer programming
b. 4+ years of working experience in application development using Java
Essential Skills:
- Hands on experience in designing and developing applications using Java EE platforms
- Object Oriented analysis and design using common design patterns.
- Profound insight into Java and JEE internals (data structures, algorithms and time complexity, memory management, transaction management, etc.)
- Excellent knowledge of Relational Databases, SQL and ORM technologies (JPA2, Hibernate)
- Experience in the Spring Framework
- Experience in developing web applications using at least one popular web framework (JSF, Wicket, GWT, Spring MVC) and UI technology (Angular/React JS)
- Ability to operate independently while establishing strong working relationships with co-workers and cross-functional teams
- Strong organizational and prioritization skills
- Demonstrate critical attention to detail and deadlines, and are self-motivated
- Ability to adapt to changes in direction and priorities in a project and deadline-oriented environment
- Strong written and verbal English communication skills
- Problem-solving attitude
Preferred skills (good to have):
- Knowledge in any UI technology (Angular, React, JS)
- Intermediate level knowledge of Unix environment (User commands, not System Admin commands)
- Understanding of capital markets and middle/back office processes in the financial services space
Experience: 1-3 years
Location: Bangalore
Notice: Immediate Joiner
Responsibilities:
- Attend to user issues and handle tickets raised by users.
- Strong experience in SQL, PostgreSQL, PL/SQL.
- Provide L2 support depending upon the priority of the issue, to meet client SLAs.
- Regularly monitor the S-MAX ticketing tool.
- Incident management: logging, prioritizing, and resolving/debugging incidents, using monitoring tools such as Openbravo and GCP.
- Write SQL queries as per business needs.
- Modify existing shell scripts on the Unix platform whenever required (Java, HTML, CSS). Interact with global customers/users. Schedule jobs through crontab.
- Regularly check business mail and reply to clients or the respective team.
- Excellent written and verbal communication skills in English, with the ability to clearly articulate solutions to complex technical problems; ability to work with business heads, administrators, and developers; excellent time management skills.
Total Experience:1-2 Years
Job Type: Contract
Notice: Immediate Joiner
- 1-2 years proven track record of development and design work in the IT industry, preferably in a software product based organization
- Java, springboot, and Microservices
- SQL
- Startup product experience – hustled through various tech, products, stacks
- Strong experience in data structures and algorithms (must).
- Good to have: experience in complex problem solving
- Architect and Develop: Design, implement, and maintain high-performance backend services using Java/Golang and intuitive frontend interfaces using React.
- Technical Leadership: Provide technical guidance and mentorship to junior developers, promoting best practices and fostering a collaborative environment.
- Code Quality: Write clean, efficient, and well-documented code following industry best practices and coding standards.
- Collaboration: Work with backend developers, frontend developers, product managers, and other stakeholders to gather requirements and deliver robust solutions.
- Performance Optimization: Identify and address performance bottlenecks and scalability issues.
- Debugging and Troubleshooting: Diagnose and resolve complex issues in both backend and frontend components.
- Testing: Implement comprehensive testing strategies, including unit tests, integration tests, and end-to-end tests.
- Continuous Learning: Stay current with the latest industry trends, technologies, and best practices in full stack development.
- Design and Develop: Architect, design, and implement high-performance Java-based backend services and applications.
- Code Quality: Write clean, efficient, and well-documented code following industry best practices and coding standards.
- Technical Leadership: Provide technical guidance and mentorship to junior developers, promoting best practices and fostering a collaborative environment.
- Collaboration: Work closely with frontend developers, product managers, and other stakeholders to understand requirements and deliver robust solutions.
- Performance Optimization: Identify and resolve performance bottlenecks and scalability issues.
- Testing: Implement comprehensive testing strategies, including unit tests, integration tests, and end-to-end tests.
- Continuous Improvement: Stay current with the latest industry trends, technologies, and best practices in Java/Golang development, and continuously improve our development processes.
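The testing bullet above can be made concrete with a minimal unit test using Python's standard unittest module (Python is used here only for brevity; the same structure applies to JUnit in Java or the testing package in Go, and `apply_discount` is an invented function under test):

```python
import unittest

def apply_discount(price, pct):
    """Business logic under test: apply a percentage discount."""
    if not (0 <= pct <= 100):
        raise ValueError("pct must be between 0 and 100")
    return round(price * (1 - pct / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    def test_typical(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_invalid_pct_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

Running the file through `python -m unittest` executes both tests; integration and end-to-end suites then build on the same pattern at larger scope.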
PlanetSpark is hiring!
Title of the Job: Data Analyst (Full Time)
Location : Gurgaon
Roles and Responsibilities/Mission Statement:
We are seeking an experienced Data Analyst to join our dynamic team. The ideal candidate will possess a strong analytical mindset, excellent problem-solving skills, and a passion for uncovering actionable insights from data. As a Data Analyst, you will be responsible for collecting, processing, and analyzing large datasets to help inform business decisions and strategies, and you will be the source of company-wide intelligence.
The responsibilities would include :
1) Creating a robust Sales MIS
2) Tracking key metrics of the company
3) Reporting key metrics on a daily basis
4) Sales incentive and teacher payout calculation
5) Tracking and analyzing large volumes of consumer data related to customers and teachers
6) Developing intelligence from data from various sources
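As a tiny illustration of the metric-tracking work described above, here is a daily revenue roll-up in plain Python. The records and field names are invented; in practice the data would come from the company's CRM or database.

```python
from collections import defaultdict

# Hypothetical daily sales records, as might be exported from a CRM.
records = [
    {"date": "2024-06-01", "rep": "asha", "amount": 1200},
    {"date": "2024-06-01", "rep": "ravi", "amount": 800},
    {"date": "2024-06-02", "rep": "asha", "amount": 500},
]

def daily_totals(rows):
    """Aggregate revenue per day: the core of a simple sales MIS report."""
    totals = defaultdict(int)
    for r in rows:
        totals[r["date"]] += r["amount"]
    return dict(totals)

report = daily_totals(records)
print(report)
```

The same grouping pattern extends to per-rep incentive calculations or teacher payouts by changing the grouping key.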
Ideal Candidate Profile -
- 1-4 years of experience in a data-intensive position at a consumer business or a Big 4 firm
- Excellent ability in advanced Excel
- Knowledge of other data analytics tools such as SQL, Python, R, and data visualization tools is good to have.
- Detail-oriented with strong organizational skills and the ability to manage multiple projects simultaneously.
- Exceptional analytical ability
Eligibility Criteria:
- Willing to work 5 days a week from the office, with Saturday work from home
- Willing to work in an early-stage startup.
- Must have 1-3 years of prior experience in a data-focused role at a consumer internet company or a Big 4 firm
- Must have excellent analytical abilities
- Available to relocate to Gurgaon
- Candidate should have their own laptop
- Gurgaon-based candidates will be given preference
Join us and leverage your analytical expertise to drive data-driven decisions and contribute to our success. Apply today!
Who are we looking for?
We are looking for a Senior Data Scientist, who will design and develop data-driven solutions using state-of-the-art methods. You should be someone with strong and proven experience in working on data-driven solutions. If you feel you’re enthusiastic about transforming business requirements into insightful data-driven solutions, you are welcome to join our fast-growing team to unlock your best potential.
Job Summary
- Supporting company mission by understanding complex business problems through data-driven solutions.
- Designing and developing machine learning pipelines in Python and deploying them in AWS/GCP, ...
- Developing end-to-end ML production-ready solutions and visualizations.
- Analysing large sets of time-series industrial data from various sources, such as production systems, sensors, and databases, to draw actionable insights and present them via custom dashboards.
- Communicating complex technical concepts and findings to non-technical stakeholders of the projects
- Implementing the prototypes using suitable statistical tools and artificial intelligence algorithms.
- Preparing high-quality research papers and participating in conferences to present and report experimental results and research findings.
- Carrying out research collaborating with internal and external teams and facilitating review of ML systems for innovative ideas to prototype new models.
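As a toy sketch of the time-series analysis mentioned above, the snippet below flags readings that spike relative to a trailing mean. The window size, threshold factor, and data are invented; real work would use pandas or similar libraries on far larger datasets.

```python
from statistics import mean

def flag_anomalies(series, window=3, factor=1.5):
    """Flag indices where a reading exceeds `factor` times the mean
    of the preceding `window` values."""
    flagged = []
    for i in range(window, len(series)):
        baseline = mean(series[i - window:i])
        if baseline > 0 and series[i] > factor * baseline:
            flagged.append(i)
    return flagged

readings = [10, 11, 10, 10, 25, 10]   # the value at index 4 spikes
print(flag_anomalies(readings))
```

Such a rolling-baseline check is a common first pass before moving to model-based anomaly detection on sensor data.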
Qualification and experience
- B.Tech/Masters/Ph.D. in computer science, electrical engineering, mathematics, data science, and related fields.
- 5+ years of professional experience in the field of machine learning, and data science.
- Experience with large-scale Time-series data-based production code development is a plus.
Skills and competencies
- Familiarity with Docker, ML libraries like PyTorch, sklearn, and pandas, SQL, and Git is a must.
- Ability to work on multiple projects. Must have strong design and implementation skills.
- Ability to conduct research based on complex business problems.
- Strong presentation skills and the ability to collaborate in a multi-disciplinary team.
- Must have programming experience in Python.
- Excellent English communication skills, both written and verbal.
Benefits and Perks
- A culture of innovation, creativity, learning, and even failure; we believe in bringing out the best in you.
- Progressive leave policy for effective work-life balance.
- Get mentored by highly qualified internal resource groups, with the opportunity to join an industry-driven mentorship program, as we believe in empowering people.
- Multicultural peer groups and supportive workplace policies.
- Work from beaches, hills, mountains, and more with the yearly workcation program; we believe in mixing elements of vacation and work.
Hiring Process
- Call with Talent Acquisition Team: After application screening, a first-level screening with the talent acquisition team to understand the candidate's goals and alignment with the job requirements.
- First Round: Technical round 1 to gauge your domain knowledge and functional expertise.
- Second Round: In-depth technical round and discussion about the departmental goals, your role, and expectations.
- Final HR Round: Culture fit round and compensation discussions.
- Offer: Congratulations, you made it!
If this position sparked your interest, apply now to initiate the screening process.
Job Description:
We are seeking a skilled Full Stack Developer to join our growing team. As a Full Stack Developer at AlphaBI, you will play a critical role in designing, developing, and maintaining both front-end and back-end components of our web and mobile applications. If you are a proactive problem-solver with a passion for technology and a desire to work in a fast-paced environment, we want to hear from you.
Key Responsibilities:
• Develop and maintain web applications using React, JavaScript, SQL, and MongoDB.
• Collaborate with cross-functional teams to define, design, and ship new features.
• Optimize applications for maximum speed and scalability.
• Ensure the technical feasibility of UI/UX designs.
• Develop server-side logic, databases, and APIs to support front-end functionality.
• Write clean, maintainable, and well-documented code.
• Troubleshoot and debug issues to enhance application performance.
• Stay updated with emerging technologies and apply them to improve the development process.
Required Skills:
• Proficiency in React and JavaScript for front-end development.
• Strong experience with SQL and MongoDB for database management.
• Understanding of RESTful APIs and web services.
• Knowledge of version control systems like Git.
• Ability to work independently and as part of a team.
• Excellent problem-solving skills and attention to detail.
Optional Skills (a plus):
• Experience with TypeScript.
• Familiarity with Next.js for server-side rendering.
• Knowledge of Flutter for cross-platform mobile app development.
Qualifications:
• Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
• 2+ years of experience as a Full Stack Developer or similar role.
• Strong portfolio demonstrating relevant work experience.
Benefits:
• Competitive salary based on experience.
• Opportunities for professional growth and development.
• Collaborative and inclusive work environment.
TVARIT GmbH develops and delivers artificial intelligence (AI) solutions for the manufacturing, automotive, and process industries. With its software products, TVARIT enables its customers to make intelligent, well-founded decisions, e.g., in predictive maintenance, OEE improvement, and predictive quality. Renowned reference customers, competent technology, a strong research team from renowned universities, and a prestigious AI award (e.g., EU Horizon 2020) make TVARIT one of the most innovative AI companies in Germany and Europe.
We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.
We are seeking a skilled and motivated Data Engineer from the manufacturing industry with over two years of experience to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure required for the collection, storage, processing, and analysis of large and complex data sets. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.
Skills Required
- Experience in the manufacturing industry (metal industry is a plus)
- 2+ years of experience as a Data Engineer
- Experience in data cleaning & structuring and data manipulation
- ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
- Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
- Experience in SQL and data structures
- Knowledge of big data technologies such as Spark, Flink, and Hadoop, and of NoSQL databases.
- Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform.
- Proficient in data management and data governance
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
Nice To Have
- Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
- Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud infrastructure.
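As a rough illustration of the ETL pipeline work this role describes, here is a minimal extract-transform-load sketch using only the Python standard library; the sensor data, table name, and column names are all invented for the example, and a real pipeline would read from production systems rather than an inline string.

```python
import csv
import io
import sqlite3

# Hypothetical raw sensor export; note the row with a missing reading.
RAW_CSV = """machine_id,timestamp,temperature_c
M1,2024-01-01T00:00:00,71.5
M1,2024-01-01T00:01:00,
M2,2024-01-01T00:00:00,68.2
"""

def extract(text):
    """Extract: parse CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing readings, cast temperature to float."""
    clean = []
    for row in rows:
        if not row["temperature_c"]:
            continue  # skip incomplete readings
        clean.append((row["machine_id"], row["timestamp"], float(row["temperature_c"])))
    return clean

def load(rows, conn):
    """Load: write cleaned rows into a SQLite staging table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings (machine_id TEXT, ts TEXT, temperature_c REAL)"
    )
    conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)  # the incomplete reading was dropped
```

The same extract/transform/load split scales up when the stages are swapped for cloud services (e.g., Azure Data Factory orchestrating the moves).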
TVARIT GmbH develops and delivers artificial intelligence (AI) solutions for the manufacturing, automotive, and process industries. With its software products, TVARIT enables its customers to make intelligent, well-founded decisions, e.g., in predictive maintenance, OEE improvement, and predictive quality. Renowned reference customers, competent technology, a strong research team from renowned universities, and a prestigious AI award (e.g., EU Horizon 2020) make TVARIT one of the most innovative AI companies in Germany and Europe.
We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.
We are seeking a skilled and motivated Senior Data Engineer from the manufacturing industry with over four years of experience to join our team. The Senior Data Engineer will oversee the department’s data infrastructure, including developing a data model, integrating large amounts of data from different systems, building and enhancing a data lakehouse and the subsequent analytics environment, and writing scripts to facilitate data analysis. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.
Skills Required:
- Experience in the manufacturing industry (metal industry is a plus)
- 4+ years of experience as a Data Engineer
- Experience in data cleaning & structuring and data manipulation
- Architect and optimize complex data pipelines, leading the design and implementation of scalable data infrastructure, and ensuring data quality and reliability at scale
- ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
- Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
- Experience in SQL and data structures
- Knowledge of big data technologies such as Spark, Flink, and Hadoop, and of NoSQL databases.
- Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform.
- Proficient in data management and data governance
- Strong analytical and problem-solving skills, with the ability to extract actionable insights from raw data to help improve the business.
- Excellent communication and teamwork abilities.
Nice To Have:
- Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
- Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud infrastructure.
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field from top-tier Indian Institutes of Information Technology (IIITs).
Benefits and Perks
- A culture that fosters innovation, creativity, continuous learning, and resilience
- Progressive leave policy promoting work-life balance
- Mentorship opportunities with highly qualified internal resources and industry-driven programs
- Multicultural peer groups and supportive workplace policies
- Annual workcation program allowing you to work from various scenic locations
- Experience the unique environment of a dynamic start-up
Why should you join TVARIT?
Working at TVARIT, a deep-tech German IT startup, offers a unique blend of innovation, collaboration, and growth opportunities. We seek individuals eager to adapt and thrive in a rapidly evolving environment.
If this opportunity excites you and aligns with your career aspirations, we encourage you to apply today!
- Architectural Leadership:
- Design and architect robust, scalable, and high-performance Hadoop solutions.
- Define and implement data architecture strategies, standards, and processes.
- Collaborate with senior leadership to align data strategies with business goals.
- Technical Expertise:
- Develop and maintain complex data processing systems using Hadoop and its ecosystem (HDFS, YARN, MapReduce, Hive, HBase, Pig, etc.).
- Ensure optimal performance and scalability of Hadoop clusters.
- Oversee the integration of Hadoop solutions with existing data systems and third-party applications.
- Strategic Planning:
- Develop long-term plans for data architecture, considering emerging technologies and future trends.
- Evaluate and recommend new technologies and tools to enhance the Hadoop ecosystem.
- Lead the adoption of big data best practices and methodologies.
- Team Leadership and Collaboration:
- Mentor and guide data engineers and developers, fostering a culture of continuous improvement.
- Work closely with data scientists, analysts, and other stakeholders to understand requirements and deliver high-quality solutions.
- Ensure effective communication and collaboration across all teams involved in data projects.
- Project Management:
- Lead large-scale data projects from inception to completion, ensuring timely delivery and high quality.
- Manage project resources, budgets, and timelines effectively.
- Monitor project progress and address any issues or risks promptly.
- Data Governance and Security:
- Implement robust data governance policies and procedures to ensure data quality and compliance.
- Ensure data security and privacy by implementing appropriate measures and controls.
- Conduct regular audits and reviews of data systems to ensure compliance with industry standards and regulations.
Company: Optimum Solutions
About the company: Optimum Solutions is a leader in the sheet metal industry, providing sheet metal solutions to sheet metal fabricators with a proven track record of reliable product delivery. From tools to software to machines, we are a one-stop shop for all your technology needs.
Role Overview:
- Creating and managing database schemas that represent and support business processes, with hands-on experience in SQL queries and managing database server deployments.
- Implementing automated testing platforms, unit tests, and CI/CD pipelines
- Proficient understanding of code versioning tools, such as GitHub, Bitbucket, and ADO
- Understanding of container platforms, such as Docker
Job Description
- We are looking for a strong Python Developer with knowledge of machine learning and deep learning frameworks.
- Your primary focus will be working with the Product and Use-case Delivery team on prompting for different Gen-AI use cases.
- You will be responsible for prompting and building use-case pipelines.
- Evaluate all Gen-AI features and use-case pipelines.
Position: AI ML Engineer
Location: Chennai (Preference) and Bangalore
Minimum Qualification: Bachelor's degree in computer science, Software Engineering, Data Science, or a related field.
Experience: 4-6 years
CTC: 16.5 - 17 LPA
Employment Type: Full Time
Key Responsibilities:
- Own the entire prompt life cycle: prompt design, prompt template creation, and prompt tuning/optimization for various Gen-AI base models
- Design and develop prompts suiting project needs
- Lead and manage a team of prompt engineers
- Stakeholder management across business and domains as required for the projects
- Evaluating base models and benchmarking performance
- Implement prompt guardrails to prevent attacks such as prompt injection, jailbreaking, and prompt leaking
- Develop, deploy and maintain auto prompt solutions
- Design and implement minimum design standards for every use case involving prompt engineering
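As a sketch of the prompt-template and guardrail work described above: the template, route, and regex patterns below are invented for illustration, and production guardrails typically rely on trained classifiers or policy engines rather than keyword lists.

```python
import re

# Illustrative injection patterns only; real systems use far richer detection.
INJECTION_PATTERNS = [
    re.compile(r"ignore (all )?(previous|prior) instructions", re.IGNORECASE),
    re.compile(r"reveal (your|the) system prompt", re.IGNORECASE),
]

# A hypothetical prompt template for a support use case.
PROMPT_TEMPLATE = (
    "You are a support assistant. Answer only questions about the product.\n"
    "User question: {question}"
)

def build_prompt(question: str) -> str:
    """Render the template, rejecting inputs that match known injection patterns."""
    for pattern in INJECTION_PATTERNS:
        if pattern.search(question):
            raise ValueError("possible prompt injection detected")
    return PROMPT_TEMPLATE.format(question=question.strip())

safe = build_prompt("How do I reset my password?")
print(safe.splitlines()[-1])
```

Wrapping every base-model call in a `build_prompt`-style function gives one place to evolve templates, tuning, and guardrails together.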
Skills and Qualifications
- Strong proficiency with Python, the Django framework, and regular expressions
- Good understanding of the machine learning frameworks PyTorch and TensorFlow
- Knowledge of Generative AI and RAG pipelines
- Strong grasp of microservice design patterns and developing scalable applications
- Ability to build and consume REST API
- Fine-tune and optimize code for better performance.
- Strong understanding of OOP and design thinking
- Understanding the nature of asynchronous programming and its quirks and workarounds
- Good understanding of server-side templating languages
- Understanding accessibility and security compliance, user authentication and authorization between multiple systems, servers, and environments
- Integration of APIs, multiple data sources and databases into one system
- Good knowledge of API gateways and proxies, such as WSO2, Kong, NGINX, and Apache HTTP Server.
- Understanding fundamental design principles behind a scalable and distributed application
- Good working knowledge on Microservices architecture, behaviour, dependencies, scalability etc.
- Experience in deploying on Cloud platform like Azure or AWS
- Familiarity and working experience with DevOps tools like Azure DevOps, Ansible, Jenkins, and Terraform
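Several of the skills above revolve around building and consuming REST APIs. A minimal self-contained sketch using only the Python standard library (the `/health` route and its payload are invented for illustration; a Django or microservice stack would replace `http.server` in practice):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    """Toy service exposing a single JSON health-check endpoint."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging for the example

# Serve on an ephemeral port in a background thread, then consume the API.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/health"
with urllib.request.urlopen(url) as resp:
    payload = json.load(resp)
print(payload)
server.shutdown()
```

The same build-then-consume shape underlies testing any REST endpoint, whatever framework serves it.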
- Engage with client business team managers and leaders independently to understand their requirements, help them structure their needs into data needs, prepare functional and technical specifications for execution, and ensure delivery from the data team. This can be a combination of ETL processes, reporting tools, and analytics tools like SAS and R.
- Lead and manage the Business Analytics team, ensuring effective execution of projects and initiatives.
- Develop and implement analytics strategies to support business objectives and drive data-driven decision-making.
- Analyze complex data sets to provide actionable insights that improve business performance.
- Collaborate with other departments to identify opportunities for process improvements and implement data-driven solutions.
- Oversee the development, maintenance, and enhancement of dashboards, reports, and analytical tools.
- Stay updated with the latest industry trends and technologies in analytics and data science.
- Lead the development, execution, and monitoring of advanced R models and analytics processes.
- Ensure data availability and integrity for model execution and analysis.
- Clean, manipulate, and visualize data from diverse sources to derive actionable insights.
- Maintain and document R scripts to ensure reproducibility and reliability of analytics processes.
- Collaborate with cross-functional teams to understand data requirements and deliver tailored solutions.
- Oversee the design and implementation of data visualization tools and dashboards using R packages such as ggplot2 and Shiny.
- Utilize databases and data manipulation packages like dplyr or data.table to handle large datasets efficiently.
- Provide guidance and support to business teams in running and interpreting model outputs.
- Maintain comprehensive documentation and logs of all activities, ensuring transparency and accountability.
- Drive continuous improvement initiatives to enhance analytics capabilities and operational efficiency.
- Communicate complex analytical findings and recommendations to non-technical stakeholders effectively.
We are seeking a skilled engineer to develop and maintain efficient and scalable data pipelines for 1M/sec+ events. The ideal candidate should leverage the right tools to deliver testable, maintainable, and modern data solutions.
Key Responsibilities:
Develop and maintain data pipelines for high-scale event processing.
Build solutions for ETL from diverse data sources using stable technologies.
Design data integrations and data quality frameworks.
Collaborate with team members towards shared product goals.
Troubleshoot and resolve data-related issues.
Essential Requirements:
Strong knowledge in Java & Spring framework.
Advanced SQL and experience with data warehousing solutions (BigQuery, Athena, Redshift).
Experience in building data pipeline architectures.
Working knowledge of message queuing, stream processing, and big data stores.
Experience with Apache Kafka and related technologies (Kafka-Connect, Spark, Nifi).
Containerization experience with Kubernetes and Docker.
Experience with cloud services (AWS, Google Cloud, Azure).
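The pipeline work above centers on aggregating very high-volume event streams. A minimal pure-Python sketch of a tumbling-window aggregation, the building block behind such pipelines (in production the events would arrive via Kafka and the aggregation would run in a stream processor, not a hard-coded list):

```python
from collections import defaultdict

# Simulated event stream of (epoch_seconds, event_type) pairs, invented for
# illustration; a real pipeline would consume these from a message queue.
EVENTS = [
    (0, "click"), (1, "view"), (4, "click"),
    (5, "click"), (7, "view"), (11, "click"),
]

def tumbling_window_counts(events, window_s=5):
    """Count events per type within fixed, non-overlapping time windows."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, kind in events:
        window_start = (ts // window_s) * window_s  # bucket by window start time
        windows[window_start][kind] += 1
    return {start: dict(counts) for start, counts in sorted(windows.items())}

counts = tumbling_window_counts(EVENTS)
print(counts)
```

At 1M+ events/sec the same logic is sharded by key across consumers, which is where Kafka partitioning and stream-processing frameworks come in.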
Job Description:
- Experience in Core Java, Spring Boot.
- Experience in microservices.
- Extensive experience in developing enterprise-scale systems for global organizations. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should be able to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS in SQL Server, Postgres, Oracle or DB2
- Good knowledge of multi-threading
- Basic working knowledge of Unix/Linux
- Excellent problem solving and coding skills in Java
- Strong interpersonal, communication and analytical skills.
- Should be able to express their design ideas and thoughts.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.
Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.
Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.
- Experience with C#, Angular, SQL, and Azure. Back end: REST APIs, Graph API
- Fluent in Azure & Microsoft .NET Stack, Web Applications & API Development, Service & Platform Management, E2E Life Cycle Management & DevOps.
- Build, Maintain and Integrate applications developed in different tech stack MVC, .Net, C#, SQL, Azure Stack, Angular
- Design and implement build-deployment-test automation.
- Microservices, Database design and implementation for relational and non-relational databases.
- Excellent interpersonal, verbal, and written communication skills.
The candidate is responsible for setting up, managing, and optimizing advertising campaigns to ensure that campaigns reach full delivery and maximum performance. The candidate is extremely organized, will leverage checklists to ensure that nothing is forgotten, and will strive for zero errors. The candidate is comfortable working in a fast-paced, demanding, metric-driven, entrepreneurial environment.
Responsibilities:
• Create tag documentation for the advertiser
• Create advertiser accounts
• Manage and monitor advertiser campaign budget and bid setup
• Manage user segmentation setup in advertiser campaigns
• Pause/resume advertiser campaigns
• Create and manage advertiser ad banner and coupon setups
• Manage third-party tracking, brand safety, viewability, and fraud detection tags
• Manage Facebook ads
• Work closely with analyst, creative, and technology teams on optimization and reporting
• Assist advertisers in resolving technical problems and answer questions on materials to be delivered
Who are we?
We are incubators of high-quality, dedicated software engineering teams for our clients. We work with product organizations to help them scale or modernize their legacy technology solutions. We work with startups to help them operationalize their idea efficiently. Incubyte strives to find people who are passionate about coding, learning, and growing along with us. We work with a limited number of clients at a time on dedicated, long term commitments with an aim of bringing a product mindset into services.
What we are looking for
We’re looking to hire software craftspeople. People who are proud of the way they work and the code they write. People who believe in and are evangelists of extreme programming principles. High quality, motivated and passionate people who make great teams. We heavily believe in being a DevOps organization, where developers own the entire release cycle and thus get to work not only on programming languages but also on infrastructure technologies in the cloud.
What you’ll be doing
First, you will be writing tests. You’ll be writing self-explanatory, clean code. Your code will produce the same, predictable results, over and over again. You’ll be making frequent, small releases. You’ll be working in pairs. You’ll be doing peer code reviews.
You will work in a product team. Building products and rapidly rolling out new features and fixes.
You will be responsible for all aspects of development – from understanding requirements, writing stories, analyzing the technical approach to writing test cases, development, deployment, and fixes. You will own the entire stack from the front end to the back end to the infrastructure and DevOps pipelines. And, most importantly, you’ll be making a pledge that you’ll never stop learning!
Skills you need in order to succeed in this role
Most Important: Integrity of character, diligence and the commitment to do your best
Must Have: SQL, Databricks, (Scala / Pyspark), Azure Data Factory, Test Driven Development
Nice to Have: SSIS, Power BI, Kafka, Data Modeling, Data Warehousing
Self-Learner: You must be extremely hands-on and obsessive about delivering clean code
- Sense of Ownership: Do whatever it takes to meet development timelines
- Experience in creating end to end data pipeline
- Experience in Azure Data Factory (ADF) creating multiple pipelines and activities using Azure for full and incremental data loads into Azure Data Lake Store and Azure SQL DW
- Working experience in Databricks
- Strong in BI/DW/Datalake Architecture, design and ETL
- Strong in Requirement Analysis, Data Analysis, Data Modeling capabilities
- Experience in object-oriented programming, data structures, algorithms and software engineering
- Experience working in Agile and Extreme Programming methodologies in a continuous deployment environment.
- Interest in mastering technologies like, relational DBMS, TDD, CI tools like Azure devops, complexity analysis and performance
- Working knowledge of server configuration / deployment
- Experience using source control and bug tracking systems, writing user stories and technical documentation
- Expertise in creating tables, procedures, functions, triggers, indexes, views, and joins, and in the optimization of complex queries
- Experience with database versioning, backups, and restores
- Expertise in data security
- Ability to perform database query performance tuning
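On the query-tuning bullet above: the usual first step is comparing query plans before and after adding an index. A small sketch using Python's built-in `sqlite3` module (the `orders` schema and data are invented; the exact plan wording varies by SQLite version, but the scan-vs-index distinction holds):

```python
import sqlite3

# Invented schema and data, enough rows to make the planner's choice visible.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(conn, sql):
    """Return SQLite's query plan for a statement as one string."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(row[-1] for row in rows)  # last column is the plan detail

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"
before = plan(conn, query)  # without an index: full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(conn, query)   # with the index: indexed search

print(before)
print(after)
```

The same workflow (capture the plan, add or adjust an index, re-check) applies to server databases, just with their own `EXPLAIN` dialects.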
Job Description:
We are seeking a detail-oriented Manual Tester to develop, document, and maintain comprehensive test plans and test cases based on software requirements. The ideal candidate will have experience in testing web and mobile applications, collaborating with Product Owners (POs), Development Leads, and providing effort estimations.
Key Responsibilities:
- Develop, document, and maintain comprehensive test plans and test cases based on software requirements.
- Collaborate with Product Owners (POs) and Development Leads to understand project requirements and provide effort estimations.
- Perform thorough manual testing of software applications, including functional, regression, integration, system, and user acceptance testing.
- Execute test cases and report results in a detailed and structured manner.
- Identify, document, and track defects and inconsistencies in the software using appropriate tools.
- Work closely with developers, product managers, and other stakeholders to understand project requirements and provide feedback on testability and quality.
- Communicate test progress, test results, and other relevant information to stakeholders in a clear and concise manner.
- Perform exploratory testing to identify areas of potential risk and areas not covered by formal test cases.
- Ensure all testing activities are conducted in accordance with the Agile methodology.
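One way to keep the test plans and test cases above structured and auditable is to store each case as a record and machine-check its completeness. A hypothetical sketch (the field names and the example case are illustrative, not a prescribed template):

```python
# Required fields every documented test case should carry; invented convention.
REQUIRED_FIELDS = ("id", "requirement", "steps", "expected_result")

def missing_fields(case: dict) -> list:
    """Return the required fields that are absent or empty in a test-case record."""
    return [f for f in REQUIRED_FIELDS if not case.get(f)]

test_case = {
    "id": "TC-042",
    "requirement": "User can reset password via email link",
    "preconditions": ["Registered account exists", "Email service reachable"],
    "steps": [
        "Open the login page and click 'Forgot password'",
        "Submit the registered email address",
        "Follow the reset link from the email",
        "Enter and confirm a new password",
    ],
    "expected_result": "Login succeeds with the new password",
    "status": "not run",
}

print(missing_fields(test_case))  # [] means the record is complete
```

Records in this shape also make effort estimation and progress reporting straightforward, since cases can be counted and filtered by status.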
Qualifications:
- Proven experience in manual testing of web and mobile applications.
- Strong understanding of software testing methodologies, tools, and processes.
- Experience in developing and maintaining test plans and test cases.
- Excellent communication and collaboration skills, with the ability to interact effectively with developers, product managers, and other stakeholders.
- Ability to identify, document, and track defects and inconsistencies in the software.
- Experience working in an Agile development environment.
- Strong analytical and problem-solving skills.
- Detail-oriented with a commitment to quality.
Experience Level: Minimum 5 years
About Pace
Started in 1995 by first-generation entrepreneurs from IIMA & FMS Delhi, PACE has evolved from a fledgling NSE Broker to a premier boutique financial conglomerate over the last 25 years. Headquartered in New Delhi, we maintain offices at more than 300 locations in more than 75 cities across India, and our customer base is spread over 34 countries. We have also been consistently nominated as one of the best Investment Advisors in India by ICRA & CNBC. At PACE we are continuously innovating and building highly scalable backend systems and strategies that give a seamless experience to our customers. We are aggressively pursuing Fintech innovation now and working on the ambitious and potentially disruptive Fintech product ‘Pocketful’—a one-of-a-kind stock-broking platform.
About Pocketful (Fintech Division of Pace)
Founded by IIM-Ahmedabad, Yale, and Columbia alumni, Pocketful is a new-age Fintech broking platform, aimed at making financial markets accessible for all. We're constantly innovating and working on a disruptive platform. The team is highly skilled, young, and extremely hungry and we are looking for folks who fit this persona. We are backed by one of India's leading stock brokers Pace Stock Broking Services.
Overview:
- We are seeking an experienced Engineering Manager or Tech Lead to join our dynamic team in the fintech industry, focusing on stockbroking solutions. The ideal candidate will have a strong technical background and leadership experience, with proficiency in our tech stacks: React.js, Flutter, Golang, and MongoDB.
Responsibilities:
- Lead and manage a team of engineers, providing guidance and mentorship.
- Oversee the design, development, and deployment of high-quality software solutions.
- Collaborate with cross-functional teams to define, design, and deliver new features.
- Ensure best practices in coding, architecture, and security are followed.
Requirements:
- Proven experience as an Engineering Manager or Tech Lead.
- Strong technical expertise in React.js, Flutter, Golang, and MongoDB.
- Excellent leadership and communication skills.
- Experience in the fintech industry, particularly in stockbroking, is a plus.
- Ability to work in a fast-paced, agile environment.
Qualifications:
- Minimum of 5+ years of experience in a technical management role.
- Bachelor's degree in Technology
- Strong project management skills, including the ability to prioritize and manage multiple tasks simultaneously.
- Excellent leadership and communication skills.
- Problem-solving and decision-making abilities.
- Results-oriented with a focus on delivering high-quality solutions.
Other details
Expected CTC: Depending on Experience & Skills
In-Person based out of Okhla, New Delhi
Culture
We’re still early-stage and we believe that the culture is an ever-evolving process. Help build the kind of culture you want in the organization. Best ideas come from collaboration and we firmly believe in that. We have a flat hierarchy, flexible with timings and we believe in continuous learning and adapting to changing needs. We want to scale fast but sustainably, keeping everyone’s growth in mind. We aim to make this job your last job.
Greetings! Wissen Technology is hiring for the position of Data Engineer.
Please find the job description below for your reference:
JD
- Design, develop, and maintain data pipelines on AWS EMR (Elastic MapReduce) to support data processing and analytics.
- Implement data ingestion processes from various sources including APIs, databases, and flat files.
- Optimize and tune big data workflows for performance and scalability.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Manage and monitor EMR clusters, ensuring high availability and reliability.
- Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and store data in data lakes and data warehouses.
- Implement data security best practices to ensure data is protected and compliant with relevant regulations.
- Create and maintain technical documentation related to data pipelines, workflows, and infrastructure.
- Troubleshoot and resolve issues related to data processing and EMR cluster performance.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data engineering, with a focus on big data technologies.
- Strong experience with AWS services, particularly EMR, S3, Redshift, Lambda, and Glue.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with big data frameworks and tools such as Hadoop, Spark, Hive, and Pig.
- Solid understanding of data modeling, ETL processes, and data warehousing concepts.
- Experience with SQL and NoSQL databases.
- Familiarity with CI/CD pipelines and version control systems (e.g., Git).
- Strong problem-solving skills and the ability to work independently and collaboratively in a team environment.
We are looking for Senior Software Engineers to design, develop, and maintain large-scale distributed ad technology systems. This entails working across several different systems, platforms, and technologies, and collaborating with various engineering teams to meet a range of technological challenges. You will work with our product team to contribute to and influence the roadmap of our products and technologies, and to influence and inspire team members.
Experience
- 3 - 10 Years
Required Skills
- 3+ years of work experience and a degree in computer science or a similar field
- Knowledgeable about computer science fundamentals including data structures, algorithms, and coding
- Enjoy owning projects from creation to completion and wearing multiple hats
- Product focused mindset
- Experience building distributed systems capable of handling large volumes of traffic
- Fluency with Java, Vert.x, Redis, and relational databases
- Possess good communication skills
- Enjoy working in a team-oriented environment that values excellence
- Have a knack for solving very challenging problems
- (Preferred) Previous experience in advertising technology or gaming apps
- (Preferred) Hands-on experience with Spark, Kafka or similar open-source software
Responsibilities
- Creating design and architecture documents
- Conducting code reviews
- Collaborate with others in the engineering teams to meet a range of technological challenges
- Design, build, and develop large-scale advertising technology systems capable of handling tens of billions of events daily
Education
- UG: B.Tech/B.E. in Computers; PG: M.Tech in Computers
What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative and inclusive work environment.
Salary budget: up to 50 LPA, or a 20% hike on current CTC.
You can message me on LinkedIn for a quick response.
Client based in Pune.
Responsibilities:
- Develop, test, and maintain robust, scalable software applications, primarily using .NET technologies.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Participate in all phases of the development lifecycle, including requirements gathering, design, implementation, testing, and deployment.
- Write clean, scalable code following best practices and coding standards.
- Troubleshoot, debug, and resolve software defects and technical issues.
- Stay updated with emerging technologies and industry trends.
Requirements:
- Minimum 2 years of experience in MS SQL.
- Strong understanding of object-oriented programming principles.
- Demonstrable knowledge of web technologies including HTML, CSS, JavaScript, jQuery, and AJAX.
- Proficiency in C#, ASP.NET MVC, and .NET Core.
- Familiarity with LINQ or Entity Framework, and SQL Server.
- Experience with architecture styles/APIs (REST, RPC).
- Understanding of Agile methodologies.
- Familiarity with Windows Presentation Foundation (WPF) is a plus.
- Understanding of fundamental design principles for building scalable applications.
- Knowledge of a JavaScript-based framework such as Angular or React is preferred.
We are a technology company operating in the media space. We are the pioneers of robot journalism in India. We use a mix of AI-generated and human-edited content across media formats, be it audio, video, or text.
Our key products include India’s first explanatory journalism portal (NewsBytes), a content platform for developers (DevBytes), and a SaaS platform for content creators (YANTRA).
Our B2C media products are consumed by more than 50 million users in a month, while our AI-driven B2B content engine helps companies create text-based content at scale.
The company was started by alumni of IIT, IIM Ahmedabad, and Cornell University. It has raised institutional financing from a well-renowned media-tech VC and a Germany-based media conglomerate.
We are hiring a talented DevOps Engineer with 3+ years of experience to join our team. If you're excited to be part of a winning team, we are a great place to grow your career.
Responsibilities
● Manage and optimise cloud infrastructure (servers and CDN)
● Build monitoring tools for the infrastructure
● Perform granular analysis of resource usage and optimise it
● Help migrate from a single cloud environment to a multi-cloud strategy
● Monitor threats and explore building a protection layer
● Develop scripts to automate certain aspects of the deployment process
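A monitoring tool of the kind listed above often starts as a simple threshold check over usage samples before graduating to a full alerting stack. Here is a hedged, minimal Python sketch; the host names, metric fields, and threshold values are illustrative assumptions, not details from the posting.

```python
from dataclasses import dataclass

@dataclass
class UsageSample:
    host: str
    cpu_pct: float   # CPU utilisation, 0-100
    disk_pct: float  # disk utilisation, 0-100

def hosts_to_alert(samples, cpu_limit=80.0, disk_limit=90.0):
    """Return the sorted set of hosts exceeding either threshold,
    ready to be handed to whatever alerting channel is in use."""
    return sorted({s.host for s in samples
                   if s.cpu_pct > cpu_limit or s.disk_pct > disk_limit})

samples = [
    UsageSample("web-1", cpu_pct=35.0, disk_pct=95.5),  # disk over limit
    UsageSample("web-2", cpu_pct=91.0, disk_pct=40.0),  # CPU over limit
    UsageSample("db-1", cpu_pct=55.0, disk_pct=60.0),   # healthy
]
```

In practice the samples would come from a cloud provider's metrics API rather than a hard-coded list, but the thresholding logic stays the same.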
Requirements and Skills
● 0-2 years of experience as a DevOps Engineer
● Proficient with AWS and GCP
● A certification from a relevant cloud provider
● Knowledge of PHP will be an advantage
● Working knowledge of databases and SQL
Python Developer at BeyondScale
BeyondScale is a technology company on a mission to democratise AI for small and medium-sized businesses (SMBs). We're building Sitara, an AI-powered ERP suite of micro-apps designed specifically for the service sector. Imagine a pocket CRM, a pocket POS, and a suite of essential tools, all streamlined for simplicity and powered by intelligent automation.
The Opportunity:
We're looking for a passionate Python Developer to join our growing team and play a key role in shaping the future of AI-powered ERP. You'll be instrumental in building Sitara, a product poised to disrupt a massive market with high growth potential.
What You'll Do:
- Design, develop, and maintain efficient, reusable, and reliable Python code for our AI-powered ERP platform.
- Develop and integrate web APIs and interact with SQL databases (NoSQL experience a plus).
- Implement automation using object-oriented programming (OOP) principles, multiprocessing, and threading.
- Write clean, well-documented code and actively participate in testing and debugging.
- Leverage Git and modern development workflow practices to ensure a smooth development cycle.
- While not required, familiarity with generative AI concepts (LLMs, RAG) is a plus.
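The OOP-plus-threading bullet above can be sketched as follows. This is a hedged illustration, not Sitara code: `InvoiceTagger` and its fields are hypothetical names, and a thread pool is used because the work described (web APIs, SQL queries) is I/O-bound; a CPU-bound batch would use `multiprocessing` instead to sidestep the GIL.

```python
from concurrent.futures import ThreadPoolExecutor

class InvoiceTagger:
    """Hypothetical micro-app task: flag invoices above a spending threshold."""

    def __init__(self, threshold):
        self.threshold = threshold

    def tag(self, invoice):
        # In practice the I/O-bound work (API call, DB lookup) would sit here.
        return {**invoice, "flagged": invoice["total"] > self.threshold}

    def tag_all(self, invoices, workers=4):
        # pool.map preserves input order, so results line up with invoices.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(self.tag, invoices))
```

Keeping the per-item work in a method of a small class makes it easy to unit-test `tag` in isolation and to swap the concurrency strategy later without touching the business logic.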
You're a Great Fit If You:
- Have 1+ years of relevant job experience working with Python.
- Possess a strong foundation in computer science fundamentals.
- Are a team player with a collaborative spirit and a positive attitude.
- Enjoy learning new technologies and are eager to push boundaries.
- Have excellent communication skills, including the ability to effectively say no when needed.