50+ SQL Jobs in Pune | SQL Job openings in Pune
Apply to 50+ SQL Jobs in Pune on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.
Tariff Analyst
Experience: 2-6 Years
Location: Pune
Type: Full-time
About Digit88
Digit88 empowers digital transformation for innovative and high growth B2B and B2C SaaS companies as their trusted offshore software product engineering partner!
We are a lean mid-stage software company, with a team of 75+ fantastic technologists, backed by executives with deep understanding of and extensive experience in consumer and enterprise product development across large corporations and startups. We build highly efficient and effective engineering teams that solve real and complex problems for our partners.
With 50+ years of collective experience across B2B and B2C SaaS, web and mobile apps, e-commerce platforms and solutions, and custom enterprise SaaS platforms, in domains spanning Conversational AI, chatbots, IoT, health-tech, ESG/energy analytics, and data engineering, the founding team thrives in a fast-paced, challenging environment that allows us to showcase our best.
The Vision: To be the most trusted technology partner to innovative software product companies world-wide
The Opportunity
Digit88 is expanding the extended software product engineering team for its partner, a US-based Energy Analytics SaaS platform company. Our partner is building a suite of cloud-based business operation support platforms in the Utilities Rate Lifecycle space in the Energy sector/domain. This is a bleeding-edge AI and Big Data platform that helps large energy utility companies in the US plan, manage, review, and optimize their new product and rate design, billing, rate analysis, forecasting, and CRM. The candidate would join an existing team of product engineers in the US, China, and Pune, India, and would help us establish an extended product engineering team at Digit88.
Job Profile
Digit88 is looking for a hands-on Data Analyst with expertise in advanced Excel and experience in SQL. The candidate must be detail-oriented, have excellent problem-solving skills, and be able to quickly learn new tools and concepts.
The role requires knowledge of database concepts and offers an opportunity to drive positive energy-consumption habits and catalyze a clean energy future. You must enjoy learning and building rapidly evolving products/platforms.
To be successful in this role, you should possess
- Overall industry experience of 2-4 years
- Bachelor's degree in an analytical subject area, e.g., Engineering or Statistics
- Proficiency in advanced Excel functions and macros, involving complex calculations and pivots
- Exceptional analytical, logical, and problem-solving skills
- Understanding of relational database concepts and familiarity with SQL
- Demonstrable aptitude for innovation and problem solving
- Good communication skills and the ability to work across cross-functional teams
- Ability to understand complex utility tariffs, rates, and programs and convert them into models
- Participation in sprint planning and other ceremonies, working passionately toward committed sprint goals
- Knowledge of, and the ability to automate routine tasks using, Python is a plus
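The Python automation mentioned above often means encoding tariff logic as code. A minimal sketch of modelling a tiered energy charge, assuming invented tier boundaries and rates (not any real utility's tariff):

```python
# Illustrative only: tier boundaries and rates below are invented.
TIERS = [
    (100, 0.10),           # first 100 kWh at $0.10/kWh
    (200, 0.15),           # next 200 kWh at $0.15/kWh
    (float("inf"), 0.20),  # everything above at $0.20/kWh
]

def tiered_charge(kwh: float) -> float:
    """Compute the energy charge for `kwh` of consumption under TIERS."""
    total, remaining = 0.0, kwh
    for block, rate in TIERS:
        used = min(remaining, block)
        total += used * rate
        remaining -= used
        if remaining <= 0:
            break
    return round(total, 2)

print(tiered_charge(250))  # 100*0.10 + 150*0.15 = 32.5
```

Maintaining such a model then reduces to updating the tier table when prices or logic change, which is exactly the kind of routine task worth automating.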
Preferred Qualifications:
- Experience in the energy industry and familiarity with basic concepts of utility (electricity/gas) tariffs
- Experience and knowledge with tools such as Microsoft Excel macros
- Familiarity with writing programs using Python or shell scripts
- Passion for working with data and data analysis
- 1+ year experience in Agile methodology.
Roles and responsibilities
- Understand complex utility tariffs, rates, and programs and convert them into models
- Analyze energy-utilization data for cost and usage patterns and derive meaningful insights
- Maintain the tariff models, with timely updates for price, logic, or other enhancements per client requirements
- Assist the delivery team in validating the input data received from clients for modelling work
- Communicate and coordinate with the delivery team
- Work with cross-functional teams to resolve issues in the modelling tool
- Build and deliver compelling demonstrations/visualizations of products
- Be a lifelong learner and develop your skills continuously
- Contribute to the success of a rapidly growing and evolving organization
Additional Project/Soft Skills:
- Should be able to work independently with India- and US-based team members.
- Strong verbal and written communication, with the ability to articulate problems and solutions over phone and email.
- Strong sense of urgency, with a passion for accuracy and timeliness.
- Ability to work calmly in high pressure situations and manage multiple projects/tasks.
- Ability to work independently and possess superior skills in issue resolution.
Benefits/Culture @ Digit88:
- Comprehensive Insurance (Life, Health, Accident)
- Flexible Work Model
- Accelerated learning & non-linear growth
- Flat organization structure driven by ownership and accountability.
- Global Peers - Working with some of the best engineers/professionals globally from the likes of Apple, Amazon, IBM Research, Adobe and other innovative product companies
- Ability to make a global impact with your work, leading innovations in Conversational AI, Tele-Medicine, Healthcare and more.
You will work with a founding team of serial entrepreneurs with multiple successful exits to their credit. The learning will be immense, as will the challenges.
This is the right time to join us and partner in our growth!
Job Title: Senior ETL Developer (DataStage and SQL)
Location: Pune
Overview:
We’re looking for a Senior ETL Developer with 5+ years of experience in ETL development, strong DataStage and SQL skills, and a track record in complex data integration projects.
Responsibilities:
- Develop and maintain ETL processes using IBM DataStage and SQL for data warehousing.
- Write advanced SQL queries to transform and validate data.
- Troubleshoot ETL jobs, optimize performance, and ensure data quality.
- Document ETL workflows and adhere to coding standards.
- Lead and mentor junior developers, providing technical guidance.
- Collaborate with architects and analysts to deliver scalable solutions.
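Advanced SQL for transforming and validating data, as described above, often looks like the following. A minimal sketch using Python's built-in sqlite3 with an invented staging table (table and column names are hypothetical):

```python
import sqlite3

# Hypothetical staging table; names and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
INSERT INTO stg_orders VALUES (1, 10.0), (1, 10.0), (2, NULL), (3, 7.5);
""")

# Two typical validation queries: duplicate keys and NULL measures.
dup_keys = conn.execute(
    "SELECT order_id FROM stg_orders GROUP BY order_id HAVING COUNT(*) > 1"
).fetchall()
null_amounts = conn.execute(
    "SELECT COUNT(*) FROM stg_orders WHERE amount IS NULL"
).fetchone()[0]

print(dup_keys, null_amounts)  # [(1,)] 1
```

The same queries run unchanged against most relational databases, which is why SQL fluency transfers well between DataStage targets.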
Qualifications:
- 5+ years in ETL development; 5+ years with IBM DataStage.
- Advanced SQL skills and experience with relational databases.
- Strong understanding of data warehousing and data integration.
- Experience in performance tuning and ETL process optimization.
- Team player with leadership abilities and excellent problem-solving skills.
Building the machine learning production system (or MLOps) is the biggest challenge most large companies currently face in making the transition to becoming an AI-driven organization. This position is an opportunity for an experienced server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-of-the-art AI solutions for Fractal clients.
Responsibilities
As MLOps Engineer, you will work collaboratively with Data Scientists and Data engineers to deploy and operate advanced analytics machine learning models. You’ll help automate and streamline Model development and Model operations. You’ll build and maintain tools for deployment, monitoring, and operations. You’ll also troubleshoot and resolve issues in development, testing, and production environments.
- Enable model tracking, model experimentation, and model automation
- Develop scalable ML pipelines
- Develop MLOps components in the machine learning development life cycle using a model repository (either of): MLflow, Kubeflow Model Registry
- Develop MLOps components in the machine learning development life cycle using machine learning services (either of): Kubeflow, DataRobot, HopsWorks, Dataiku, or any relevant ML E2E PaaS/SaaS
- Work across all phases of the model development life cycle to build MLOps components
- Build the knowledge base required to deliver increasingly complex MLOps projects on Azure
- Be an integral part of client business development and delivery engagements across multiple domains
Required Qualifications
- 3-5 years of experience building production-quality software
- B.E/B.Tech/M.Tech in Computer Science or a related technical degree, or equivalent
- Strong experience in system integration, application development, or data warehouse projects across technologies used in the enterprise space
- Knowledge of MLOps, machine learning, and Docker
- Object-oriented languages (e.g., Python, PySpark, Java, C#, C++)
- CI/CD experience (e.g., Jenkins, GitHub Actions)
- Database programming using any flavors of SQL
- Knowledge of Git for Source code management
- Ability to collaborate effectively with highly technical resources in a fast-paced environment
- Ability to solve complex challenges/problems and rapidly deliver innovative solutions
- Foundational Knowledge of Cloud Computing on Azure
- Hunger and passion for learning new skills
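Model tracking and experiment comparison, which registries like MLflow and Kubeflow provide, can be illustrated with a toy in-memory tracker. This is a conceptual sketch only, not the MLflow API; real tools add persistence, artifact storage, and a UI on top of this core idea:

```python
# Toy experiment tracker: the core of what MLflow-style tools automate.
from dataclasses import dataclass, field

@dataclass
class Run:
    params: dict
    metrics: dict = field(default_factory=dict)

class Tracker:
    def __init__(self):
        self.runs = []

    def log_run(self, params: dict, metrics: dict) -> Run:
        """Record one training run's parameters and resulting metrics."""
        run = Run(params=params, metrics=metrics)
        self.runs.append(run)
        return run

    def best_run(self, metric: str) -> Run:
        """Return the run with the highest value of the given metric."""
        return max(self.runs, key=lambda r: r.metrics[metric])

tracker = Tracker()
tracker.log_run({"lr": 0.1}, {"accuracy": 0.81})
tracker.log_run({"lr": 0.01}, {"accuracy": 0.87})
print(tracker.best_run("accuracy").params)  # {'lr': 0.01}
```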
Building the machine learning production system (or MLOps) is the biggest challenge most large companies currently face in making the transition to becoming an AI-driven organization. This position is an opportunity for an experienced server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-of-the-art AI solutions for Fractal clients.
Responsibilities
As MLOps Engineer, you will work collaboratively with Data Scientists and Data engineers to deploy and operate advanced analytics machine learning models. You’ll help automate and streamline Model development and Model operations. You’ll build and maintain tools for deployment, monitoring, and operations. You’ll also troubleshoot and resolve issues in development, testing, and production environments.
- Enable model tracking, model experimentation, and model automation
- Develop scalable ML pipelines
- Develop MLOps components in the machine learning development life cycle using a model repository (either of): MLflow, Kubeflow Model Registry
- Develop MLOps components in the machine learning development life cycle using machine learning services (either of): Kubeflow, DataRobot, HopsWorks, Dataiku, or any relevant ML E2E PaaS/SaaS
- Work across all phases of the model development life cycle to build MLOps components
- Build the knowledge base required to deliver increasingly complex MLOps projects on Azure
- Be an integral part of client business development and delivery engagements across multiple domains
Required Qualifications
- 5.5-9 years of experience building production-quality software
- B.E/B.Tech/M.Tech in Computer Science or a related technical degree, or equivalent
- Strong experience in system integration, application development, or data warehouse projects across technologies used in the enterprise space
- Expertise in MLOps, machine learning, and Docker
- Object-oriented languages (e.g., Python, PySpark, Java, C#, C++)
- Experience developing CI/CD components for production-ready ML pipelines
- Database programming using any flavors of SQL
- Knowledge of Git for Source code management
- Ability to collaborate effectively with highly technical resources in a fast-paced environment
- Ability to solve complex challenges/problems and rapidly deliver innovative solutions
- Team handling, problem solving, project management and communication skills & creative thinking
- Foundational Knowledge of Cloud Computing on Azure
- Hunger and passion for learning new skills
Responsibilities:
- Proficient in SQL
- 4+ years of experience
- Banking domain expertise
- Excellent data analysis and problem-solving skills
- Attention to data details and accuracy
- ETL knowledge and experience
- Identifies, creates, and analyzes data, information, and reports to make recommendations and enhance organizational capability
- Excellent communication skills
- Experience in using business analysis tools and techniques
- Knowledge and understanding of various business analysis methodologies
- Attention to detail and problem-solving skills
- Notice period: Immediate joiner
- Design and build advanced applications for the Android platform
- Collaborate with cross-functional teams to define, design, and ship new features
- Troubleshoot and fix bugs in new and existing applications
- Continuously discover, evaluate, and implement new development tools
- Work with outside data sources and APIs
- Knowledge of the Android SDK, Java programming, Kotlin, Jetpack Compose, and Realm
- Version control and clean architecture
Job Description for QA Engineer:
- 6-10 years of experience in ETL Testing, Snowflake, DWH Concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience with Azure and Snowflake testing is a plus
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts and experience with ETL tools like Talend Cloud Data Integration and Pentaho/Kettle
- Experience in JIRA and the Xray defect management tool is good to have
- Exposure to financial domain knowledge is considered a plus
- Testing data readiness (data quality) and addressing code or data issues
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrate strong collaborative experience across regions (APAC, EMEA and NA) to effectively and efficiently identify root cause of code/data issues and come up with a permanent solution
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
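A core ETL-testing task implied above is reconciling source and target data after a load. A minimal sketch using Python's built-in sqlite3; the tables and figures are invented for illustration:

```python
import sqlite3

# Hypothetical source/target tables: a classic ETL test reconciles row
# counts and aggregates between them after a load.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER, amt REAL);
CREATE TABLE tgt (id INTEGER, amt REAL);
INSERT INTO src VALUES (1, 5.0), (2, 6.0), (3, 7.0);
INSERT INTO tgt VALUES (1, 5.0), (2, 6.0);
""")

def reconcile(conn):
    """Compare counts and sums; nonzero differences indicate dropped rows."""
    src_n, src_sum = conn.execute("SELECT COUNT(*), SUM(amt) FROM src").fetchone()
    tgt_n, tgt_sum = conn.execute("SELECT COUNT(*), SUM(amt) FROM tgt").fetchone()
    return {"count_diff": src_n - tgt_n, "sum_diff": src_sum - tgt_sum}

print(reconcile(conn))  # {'count_diff': 1, 'sum_diff': 7.0}
```

Against Snowflake or a change-data-capture feed, the same count-and-sum comparison is typically the first check run, before drilling into row-level diffs.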
Key Attributes include:
- Team player with professional and positive approach
- Creative, innovative and able to think outside of the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated & self-sufficient
- Effective communicator, both written and verbal
- Brings a high level of energy with enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution
Job Description:
- 3+ years of experience in functional testing with a good foundation in technical expertise
- Experience in the Capital Markets/Investment Banking domain is a MUST
- Exposure to API testing tools like SoapUI and Postman
- Well versed in SQL
- Hands on experience in debugging issues using Unix commands
- Basic understanding of XML and JSON structures
- Knowledge of FitNesse is good to have
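API testing of the kind listed above usually includes checking JSON payload structure. A minimal stdlib-only sketch; the message fields are invented for illustration, not any real trading API:

```python
import json

def validate_trade(payload: str) -> list:
    """Return a list of problems found in a hypothetical trade message."""
    required = {"trade_id": int, "symbol": str, "quantity": int}
    problems = []
    try:
        data = json.loads(payload)
    except json.JSONDecodeError:
        return ["payload is not valid JSON"]
    for field, typ in required.items():
        if field not in data:
            problems.append(f"missing field: {field}")
        elif not isinstance(data[field], typ):
            problems.append(f"wrong type for {field}")
    return problems

# Quantity arrives as a string instead of an integer:
print(validate_trade('{"trade_id": 1, "symbol": "INFY", "quantity": "10"}'))
# ['wrong type for quantity']
```

Tools like Postman and SoapUI automate sending the request; assertions of this shape are what the test scripts attached to those requests typically do.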
Location:
Pune/Mumbai
About Wissen Technology:
· The Wissen Group was founded in 2000. Wissen Technology, a part of Wissen Group, was established in 2015.
· Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world class products.
· Our workforce consists of 550+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
· Wissen Technology has grown its revenues by 400% in five years without any external funding or investments.
· Globally present with offices in the US, India, UK, Australia, Mexico, and Canada.
· We offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
· Wissen Technology has been certified as a Great Place to Work®.
· Wissen Technology has been voted as the Top 20 AI/ML vendor by CIO Insider in 2020.
· Over the years, Wissen Group has successfully delivered $650 million worth of projects for more than 20 of the Fortune 500 companies.
· We have served clients across sectors like Banking, Telecom, Healthcare, Manufacturing, and Energy. They include Morgan Stanley, MSCI, StateStreet, Flipkart, Swiggy, Trafigura, GE to name a few.
Website: www.wissen.com
JD:
- QA resource to support ETL (DataStage)-DWH and BI report test execution for assigned projects (with 6+ years of experience in ETL and SQL)
- Create, and be responsible for ensuring, all testing analyst work products for assigned projects are delivered according to procedure and schedule
- Determine automation test strategies for functional, regression, and/or smoke testing for application/function-based changes
- Interact with business users during SIT, UAT, sign-off, and project life cycle activities
Key Duties and Responsibilities:
- Work with the QA/project manager and assess project/change artifacts to identify functional requirements and opportunities to support complex multi-application/platform projects
- Good understanding of data warehouse and ETL processes
- Good understanding of SQL/PL-SQL and databases like Oracle and Hive
- Good understanding of OBIEE/OAC, BI data, BI reports, and data validation
- Good understanding of the Actimize application: CDD, IFM, WLF, SAM
- Good understanding of the anti-money laundering and fraud domain
- Understanding of Kafka and real-time data feeds, consumption, and validation
- Good understanding of case management, alert generation, the Actimize tool, and its workflow
- Identify, assign, and monitor the activities of test specialists in support of the multiple deliverables under their area of responsibility
- Proactively communicate and collaborate with test teams and support partners
- Participate in and drive test automation improvements with tools like ETL Validator, Python scripts, Selenium, etc.
- Create and manage the execution of automation scripts according to the test plan and schedule
- Monitor overall test execution and manage contingency planning for test plan variances
- Identify and communicate risks during test planning and execution
We will invite candidates for the selection process who meet the following criteria:
- Graduates/postgraduates from the computer stream only (16 years of education: 10+2+4 or 10+3+3), having passed out in 2023/24
- 65% minimum marks in all semesters, cleared in one go
- Excellent communication skills - written and verbal in English
Further details on salary, benefits, location and working hours:
- Compensation: Rs. 20,000/- (inclusive of PF) stipend for the first 3 months while on training, and on successful completion Rs. 4 LPA
- Location: Pune
- Working hours: UK timing (8am – 5pm)
- Health insurance coverage of Rs 5L will be provided for the duration of your employment
Selection process
The selection process will consist of an aptitude assessment (1.5 hr), followed by a technical MCQ test on Java and SQL (20 min) and a Java programming assignment. Candidates who clear these rounds will then have a personal interview.
We request that you only reply to this email message if you meet the selection criteria and agree with the terms and conditions. Please also mention which position you are applying for.
We will also ask you to answer the following screening questions if you wish to apply for any of these open vacancies.
Why should Onepoint consider you for an interview?
Which values are important for you at the workplace and why?
Where would you like to be in 2 to 3 years’ time in terms of your career?
Who are we looking for?
We are looking for a Senior Data Scientist, who will design and develop data-driven solutions using state-of-the-art methods. You should be someone with strong and proven experience in working on data-driven solutions. If you feel you’re enthusiastic about transforming business requirements into insightful data-driven solutions, you are welcome to join our fast-growing team to unlock your best potential.
Job Summary
- Supporting company mission by understanding complex business problems through data-driven solutions.
- Designing and developing machine learning pipelines in Python and deploying them in AWS/GCP, ...
- Developing end-to-end ML production-ready solutions and visualizations.
- Analyse large sets of time-series industrial data from various sources, such as production systems, sensors, and databases to draw actionable insights and present them via custom dashboards.
- Communicating complex technical concepts and findings to non-technical stakeholders of the projects
- Implementing the prototypes using suitable statistical tools and artificial intelligence algorithms.
- Preparing high-quality research papers and participating in conferences to present and report experimental results and research findings.
- Carrying out research collaborating with internal and external teams and facilitating review of ML systems for innovative ideas to prototype new models.
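Analyzing time-series sensor data for actionable insights, as described above, can start as simply as flagging readings that deviate from a rolling mean. A minimal sketch; the window size, threshold, and readings are arbitrary illustrative choices:

```python
# Flag readings that deviate from the rolling mean of the previous window.
from statistics import mean

def flag_anomalies(series, window=3, threshold=2.0):
    """Return one boolean per reading after the warm-up window."""
    flags = []
    for i in range(window, len(series)):
        baseline = mean(series[i - window:i])
        flags.append(abs(series[i] - baseline) > threshold)
    return flags

readings = [10.0, 10.2, 9.9, 10.1, 15.0, 10.0]  # invented sensor values
print(flag_anomalies(readings))  # [False, True, False]
```

Production systems replace this with pandas rolling windows or learned models, but the dashboard-ready output (a flag per timestamp) has the same shape.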
Qualification and experience
- B.Tech/Masters/Ph.D. in computer science, electrical engineering, mathematics, data science, or a related field.
- 5+ years of professional experience in the field of machine learning, and data science.
- Experience with large-scale Time-series data-based production code development is a plus.
Skills and competencies
- Familiarity with Docker, ML libraries such as PyTorch, sklearn, and pandas, as well as SQL and Git, is a must.
- Ability to work on multiple projects. Must have strong design and implementation skills.
- Ability to conduct research based on complex business problems.
- Strong presentation skills and the ability to collaborate in a multi-disciplinary team.
- Must have programming experience in Python.
- Excellent English communication skills, both written and verbal.
Benefits and Perks
- Culture of innovation, creativity, learning, and even failure; we believe in bringing out the best in you.
- Progressive leave policy for effective work-life balance.
- Get mentored by highly qualified internal resource groups and opportunity to avail industry-driven mentorship program, as we believe in empowering people.
- Multicultural peer groups and supportive workplace policies.
- Work from beaches, hills, mountains, and many more with the yearly workcation program; we believe in mixing elements of vacation and work.
Hiring Process
- Call with Talent Acquisition Team: After application screening, a first-level screening with the talent acquisition team to understand the candidate's goals and alignment with the job requirements.
- First Round: Technical round 1 to gauge your domain knowledge and functional expertise.
- Second Round: In-depth technical round and discussion about the departmental goals, your role, and expectations.
- Final HR Round: Culture fit round and compensation discussions.
- Offer: Congratulations you made it!
If this position sparked your interest, apply now to initiate the screening process.
TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in predictive maintenance, increasing OEE, and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned universities, and the award of a renowned AI prize (e.g., EU Horizon 2020), which makes TVARIT one of the most innovative AI companies in Germany and Europe.
We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.
We are seeking a skilled and motivated Data Engineer from the manufacturing industry with over two years of experience to join our team. As a data engineer, you will be responsible for designing, building, and maintaining the infrastructure required for the collection, storage, processing, and analysis of large and complex data sets. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.
Skills Required
- Experience in the manufacturing industry (metal industry is a plus)
- 2+ years of experience as a Data Engineer
- Experience in data cleaning & structuring and data manipulation
- ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
- Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
- Experience in SQL and data structures
- Knowledge of big data technologies such as Apache Spark, Flink, and Hadoop, and NoSQL databases.
- Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform.
- Proficient in data management and data governance
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
Nice To Have
- Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
- Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud infrastructure.
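The ETL pipeline work described above can be sketched end to end in a few lines of Python. Column names and data are invented, and a real pipeline would read from files or APIs rather than an inline string:

```python
# Minimal extract-transform-load sketch: CSV text in, cleaned rows into SQLite.
import csv, io, sqlite3

RAW = """machine_id,temp_c
m1, 61.5
m2,
m3,58.0
"""

def extract(text):
    """Parse CSV text into a list of dicts, one per row."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing temperature, strip whitespace, cast to float."""
    out = []
    for r in rows:
        t = (r["temp_c"] or "").strip()
        if t:
            out.append((r["machine_id"].strip(), float(t)))
    return out

def load(rows, conn):
    """Write cleaned rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS readings (machine_id TEXT, temp_c REAL)")
    conn.executemany("INSERT INTO readings VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0])  # 2
```

Frameworks like Azure Data Factory or Airflow orchestrate the same three stages at scale; keeping each stage a pure function, as here, is what makes pipelines testable.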
TVARIT GmbH develops and delivers solutions in the field of artificial intelligence (AI) for the manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in predictive maintenance, increasing OEE, and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned universities, and the award of a renowned AI prize (e.g., EU Horizon 2020), which makes TVARIT one of the most innovative AI companies in Germany and Europe.
We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.
We are seeking a skilled and motivated Senior Data Engineer from the manufacturing industry with over four years of experience to join our team. The Senior Data Engineer will oversee the department's data infrastructure, including developing a data model, integrating large amounts of data from different systems, building and enhancing a data lake-house and subsequent analytics environment, and writing scripts to facilitate data analysis. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.
Skills Required:
- Experience in the manufacturing industry (metal industry is a plus)
- 4+ years of experience as a Data Engineer
- Experience in data cleaning & structuring and data manipulation
- Architect and optimize complex data pipelines, leading the design and implementation of scalable data infrastructure, and ensuring data quality and reliability at scale
- ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
- Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
- Experience in SQL and data structures
- Knowledge of big data technologies such as Apache Spark, Flink, and Hadoop, and NoSQL databases.
- Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform.
- Proficient in data management and data governance
- Strong analytical experience & skills that can extract actionable insights from raw data to help improve the business.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
Nice To Have:
- Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
- Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud infrastructure.
- Bachelor’s degree in computer science, Information Technology, Engineering, or a related field from top-tier Indian Institutes of Information Technology (IIITs).
Benefits and Perks
- A culture that fosters innovation, creativity, continuous learning, and resilience
- Progressive leave policy promoting work-life balance
- Mentorship opportunities with highly qualified internal resources and industry-driven programs
- Multicultural peer groups and supportive workplace policies
- Annual workcation program allowing you to work from various scenic locations
- Experience the unique environment of a dynamic start-up
Why should you join TVARIT?
Working at TVARIT, a deep-tech German IT startup, offers a unique blend of innovation, collaboration, and growth opportunities. We seek individuals eager to adapt and thrive in a rapidly evolving environment.
If this opportunity excites you and aligns with your career aspirations, we encourage you to apply today!
Greetings! Wissen Technology is hiring for the position of Data Engineer.
Please find the job description below for your reference:
JD
- Design, develop, and maintain data pipelines on AWS EMR (Elastic MapReduce) to support data processing and analytics.
- Implement data ingestion processes from various sources including APIs, databases, and flat files.
- Optimize and tune big data workflows for performance and scalability.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Manage and monitor EMR clusters, ensuring high availability and reliability.
- Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and store data in data lakes and data warehouses.
- Implement data security best practices to ensure data is protected and compliant with relevant regulations.
- Create and maintain technical documentation related to data pipelines, workflows, and infrastructure.
- Troubleshoot and resolve issues related to data processing and EMR cluster performance.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data engineering, with a focus on big data technologies.
- Strong experience with AWS services, particularly EMR, S3, Redshift, Lambda, and Glue.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with big data frameworks and tools such as Hadoop, Spark, Hive, and Pig.
- Solid understanding of data modeling, ETL processes, and data warehousing concepts.
- Experience with SQL and NoSQL databases.
- Familiarity with CI/CD pipelines and version control systems (e.g., Git).
- Strong problem-solving skills and the ability to work independently and collaboratively in a team environment
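EMR is AWS's managed Hadoop/Spark service, and the MapReduce pattern at its core can be sketched in plain Python. The event records below are invented for illustration; real EMR jobs distribute these phases across a cluster:

```python
# Plain-Python sketch of the map/reduce aggregation pattern EMR distributes.
from collections import defaultdict

events = [("click", 1), ("view", 1), ("click", 1), ("view", 1), ("view", 1)]

def map_phase(records):
    """Emit (key, value) pairs; on a cluster, this runs per input split."""
    for key, value in records:
        yield key, value

def reduce_phase(pairs):
    """Aggregate values by key; on a cluster, this runs after the shuffle."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

counts = reduce_phase(map_phase(events))
print(counts)  # {'click': 2, 'view': 3}
```

Spark's `reduceByKey` and Hive's `GROUP BY` express the same computation declaratively; understanding this underlying shape is what makes tuning those workflows tractable.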
Client based in Pune.
Responsibilities:
Develop, test, and maintain robust, scalable software applications primarily using .NET technologies.
Collaborate with cross-functional teams to define, design, and ship new features.
Participate in all phases of the development lifecycle, including requirements gathering, design, implementation, testing, and deployment.
Write clean, scalable code following best practices and coding standards.
Troubleshoot, debug, and resolve software defects and technical issues.
Stay updated with emerging technologies and industry trends.
Requirements:
Minimum 2 years of experience in MS SQL.
Strong understanding of object-oriented programming principles.
Demonstrable knowledge of web technologies including HTML, CSS, JavaScript, jQuery, AJAX, etc.
Proficiency in C#, ASP.NET MVC, and .NET Core.
Familiarity with LINQ or Entity Framework, and SQL Server.
Experience with architecture styles/APIs (REST, RPC).
Understanding of Agile methodologies.
Experience with ASP.NET MVC and .NET Core.
Familiarity with Windows Presentation Framework (WPF) is a plus.
Understanding of fundamental design principles for building scalable applications.
Knowledge of any JavaScript-based framework like Angular or React is preferred.
Immediate hiring for Fullstack Developer !!!
Skills Required
1. Frontend technologies: TypeScript, React
2. Backend technologies: NestJS/Express, Python
3. SQL is a must
Experience: 5 to 7 years
Notice period: Immediate to 30 days max
Sr. Data Engineer (Data Warehouse-Snowflake)
Experience: 5+yrs
Location: Pune (Hybrid)
As a Senior Data Engineer with Snowflake expertise, you are a curious, innovative subject-matter expert who mentors young professionals. You will be a key person in converting our vision and data strategy into delivered data solutions. With your knowledge you will help create data-driven thinking within the organization, not just within data teams, but also in the wider stakeholder community.
Skills Preferred
- Advanced written, verbal, and analytic skills, and demonstrated ability to influence and facilitate sustained change. Ability to convey information clearly and concisely to all levels of staff and management about programs, services, best practices, strategies, and organizational mission and values.
- Proven ability to focus on priorities, strategies, and vision.
- Very good understanding of Data Foundation initiatives, such as data modelling, data quality management, data governance, data maturity assessments, and data strategy, in support of the key business stakeholders.
- Actively deliver the roll-out and embedding of Data Foundation initiatives in support of the key business programs advising on the technology and using leading market standard tools.
- Coordinate the change management process, incident management and problem management process.
- Ensure traceability of requirements from Data through testing and scope changes, to training and transition.
- Drive implementation efficiency and effectiveness across the pilots and future projects to minimize cost, increase speed of implementation, and maximize value delivery.
Knowledge Preferred
- Extensive knowledge of and hands-on experience with Snowflake and its components, such as user/role management, warehouse management, external stages and tables, working with semi-structured data, Snowpipe, etc.
- Implement and manage CI/CD for migrating and deploying Snowflake code to higher environments.
- Proven experience with Snowflake access control and authentication, data security, data sharing, the VS Code extension for Snowflake, replication and failover, and SQL optimization; the analytical ability to quickly troubleshoot and debug development and production issues is key to success in this role.
- Proven technology champion in working with relational and data-warehouse databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Highly experienced in building and optimizing complex queries; good with manipulating, processing, and extracting value from large, disconnected datasets.
- Your experience in handling big data sets and big data technologies will be an asset.
- Proven champion with in-depth knowledge of at least one of the scripting languages: Python, SQL, PySpark.
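As a much-simplified illustration of the query authoring and optimization work described above, the sketch below runs a windowed ranking query of the sort commonly written in Snowflake, demonstrated with Python's built-in sqlite3 (window functions need SQLite 3.25+); the table and values are invented:

```python
import sqlite3

# Sketch of analytical SQL common in warehouse work, shown with
# SQLite's window-function support (requires SQLite 3.25 or newer).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE daily_usage (region TEXT, day INTEGER, kwh REAL);
INSERT INTO daily_usage VALUES
  ('west', 1, 10.0), ('west', 2, 12.0), ('west', 3, 11.0),
  ('east', 1, 20.0), ('east', 2, 18.0);
""")
# Rank each day's usage within its region, highest first.
ranked = con.execute("""
    SELECT region, day, kwh,
           RANK() OVER (PARTITION BY region ORDER BY kwh DESC) AS rnk
    FROM daily_usage
    ORDER BY region, rnk
""").fetchall()
con.close()
```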
Primary responsibilities
- You will bring deep technical skills and capabilities to the team and become a key part of projects defining the data journey in our company, keen to engage, network, and innovate in collaboration with company-wide teams.
- Collaborate with the data and analytics team to develop and maintain a data model and data governance infrastructure using a range of different storage technologies that enables optimal data storage and sharing using advanced methods.
- Support the development of processes and standards for data mining, data modeling and data protection.
- Design and implement continuous process improvements for automating manual processes and optimizing data delivery.
- Assess and report on the unique data needs of key stakeholders and troubleshoot any data-related technical issues through to resolution.
- Work to improve data models that support business intelligence tools, improve data accessibility and foster data-driven decision making.
- Ensure traceability of requirements from Data through testing and scope changes, to training and transition.
- Manage and lead technical design and development activities for implementation of large-scale data solutions in Snowflake to support multiple use cases (transformation, reporting and analytics, data monetization, etc.).
- Translate advanced business data, integration and analytics problems into technical approaches that yield actionable recommendations, across multiple, diverse domains; communicate results and educate others through design and build of insightful presentations.
- Exhibit strong knowledge of the Snowflake ecosystem and can clearly articulate the value proposition of cloud modernization/transformation to a wide range of stakeholders.
Relevant work experience
Bachelor’s degree in a Science, Technology, Engineering, Mathematics, or Computer Science discipline (or equivalent) with 7+ years of experience in enterprise-wide data warehousing, governance, policies, procedures, and implementation.
Aptitude for working with data, interpreting results, business intelligence and analytic best practices.
Business understanding
Good knowledge and understanding of Consumer and industrial products sector and IoT.
Good functional understanding of solutions supporting business processes.
Skill Must have
- Snowflake: 5+ years
- Data warehousing technologies overall: 5+ years
- SQL: 5+ years
- Data warehouse design experience: 3+ years
- Experience with cloud and on-prem hybrid models in data architecture
- Knowledge of Data Governance and a strong understanding of data lineage and data quality
- Programming & scripting: Python, PySpark
- Database technologies such as traditional RDBMS (MS SQL Server, Oracle, MySQL, PostgreSQL)
Nice to have
- Demonstrated experience in modern enterprise data integration platforms such as Informatica
- AWS cloud services: S3, Lambda, Glue, Kinesis, API Gateway, EC2, EMR, RDS, and Redshift
- Good understanding of Data Architecture approaches
- Experience in designing and building streaming data ingestion, analysis, and processing pipelines using Kafka, Kafka Streams, Spark Streaming, StreamSets, and similar cloud-native technologies.
- Experience with implementation of operations concerns for a data platform such as monitoring, security, and scalability
- Experience working in DevOps, Agile, Scrum, Continuous Delivery and/or Rapid Application Development environments
- Building mock and proof-of-concepts across different capabilities/tool sets exposure
- Experience working with structured, semi-structured, and unstructured data, extracting information, and identifying linkages across disparate data sets
- Design and maintain efficient database solutions using RDBMS.
- Write complex SQL queries for data extraction and manipulation.
- Implement and optimize AWS services for scalable application deployment.
- Develop server-side logic using PHP and integrate front-end elements with JavaScript.
- Collaborate with teams to design, develop, and deploy web applications.
3-6 years of experience in functional testing with a good foundation in technical expertise.
Experience in the Capital Markets domain is a MUST.
Exposure to API testing tools like SoapUI and Postman.
Well-versed in SQL.
Hands-on experience debugging issues using Unix commands.
Basic understanding of XML and JSON structures.
Knowledge of Finesse is good to have.
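For the XML and JSON structures mentioned above, a minimal Python illustration with invented payload fields:

```python
import json
import xml.etree.ElementTree as ET

# Parse the same (hypothetical) trade record in both formats
# and pull out matching fields.
json_payload = '{"trade": {"id": "T100", "qty": 250}}'
xml_payload = "<trade><id>T100</id><qty>250</qty></trade>"

trade = json.loads(json_payload)["trade"]   # dict access for JSON
root = ET.fromstring(xml_payload)           # element tree for XML
xml_qty = int(root.findtext("qty"))
```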
What You’ll Be Doing:
- Subject-matter expert on implementing Apex code and workflow code end-to-end in Salesforce, identifying dependencies in support of delivery, release, and change management.
- Expert in Salesforce data structures and object structures, and in patterns for extending them to provide highly customised data storage.
- Partner with all cross-functional teams to determine Salesforce CRM needs.
- Develop customised solutions within the Salesforce platform, including design, implementation, quality control, end-to-end testing plans in multiple environments (dev/test/stage/prod), troubleshooting and bug fixing.
- Own stability of the applications to Engineering standards including, incident management, monitoring, alerting and incident resolution for all aspects of the application.
- Create timelines, set expectations with RevOps and other cross-functional stakeholders.
- Collaborate with Engineering to ensure changes to event implementations and integrations are gracefully handled in the application.
- Maintain the security and data integrity of the application software.
- Research, diagnose, and monitor performance bottlenecks.
- Write documentation and provide technical training for cross-functional teams.
- Enjoy working with remote teams and people who have strong opinions
- Excellent verbal and written communication skills
- Excellent leadership and collaboration skills and the ability to lead large projects autonomously.
- Exhibits a high level of initiative and integrity and is empathetic to the needs of individuals across the organisation
- Enjoy working with a diverse group of people who have strong opinions.
- Comfortable not knowing answers, but resourceful and able to resolve issues.
- Strong problem solving and critical thinking skills.
- Strong project management, presentation and public speaking skills.
- Self-starter who’s comfortable with ambiguity, asking questions, and adept at shifting priorities
- Strong customer service skills with proven service mentality
- Strong in documenting business processes and communicating system requirements
Desired Traits, Qualifications, Education and Experience Equivalency
- Be able to make decisions, meet targets and work under pressure.
- Ability to set up, facilitate, and lead service and process improvement sessions with a range of business stakeholders.
- Ability to present to groups of mixed technical understanding.
- Adept at creating visuals to tell a story.
- Ability to build strong trust based relationships.
- Adept at shifting priorities while maintaining a high degree of organisation and control.
- Ability to manage multiple tasks and projects simultaneously.
- Ability to recommend actionable insights from projects and lead projects autonomously.
- Ability to travel to remote sites. (less than 25%).
- Demonstrated ability to work with geographically dispersed teams.
- Ability to learn and understand new ways of doing things.
- Ability to drive standards and best practices for a team or organisation.
- Ability to exercise good judgment within broadly defined practices and policies.
- Ability to lead a team towards high quality decisions when they have differing perspectives and ideas.
- Ability to provide business context for engineers as well as highlight technical challenges for non-engineers.
- Excellent decision-making skills and the ability to work in a collaborative environment, a team player.
Traits of a successful partnership:
- You have a passionate commitment to people and deep empathy for how supporting individuals leads to a stronger company culture.
- You’re Collaborative – It is expected that you will partner with various departments to ensure organisational needs are met and to develop strategic programs.
- You will serve as an Advocate – In the role, you will solicit and listen to concerns, and take an active role in resolving problems.
Preferred Experience/ Minimum Qualifications
- Graduate in Computer Science or a related field, or Professional Qualification/Salesforce Certification
- 4+ years of Salesforce developer experience
- 4 years experience in application development and/or software engineering.
- 2 years of proven continuous improvement analytical experience from a similar role, including project management and business analysis with an excellent understanding of continuous improvement concepts.
- Advanced knowledge of Salesforce CRM platform
- Proficiency with Salesforce Lightning Design System and Salesforce development lifecycle
- Demonstrated proficiency in SQL, Apex, LWC, Java and VisualForce
- Experience with Salesforce Sales, Service and Marketing clouds
- Experience developing customer-facing interfaces, including reports and dashboards, in Salesforce or other systems.
- Strong systems knowledge with the ability to effectively utilise DevOps tools such as Metadata API, GIT, Visual Studio Code, Jenkins, Jira, Confluence, etc.
- Strong understanding of Product development lifecycles.
- Experience with leading and coordinating cross-functional initiatives, conducting interviews and performing analyses to create business cases for projects.
- Experience performing live training sessions for internal and external stakeholders during system and process implementation.
- Must have strong communication skills and possess the ability to interact effectively with co-workers.
- Must have strong leadership skills.
- Additional Salesforce Certifications e.g. Certified Salesforce Administrator, Certified Salesforce Platform App Builder, Platform Developer II, JavaScript Developer I are highly desirable.
- Salesforce DevOps experience is highly desirable.
- Salesforce Developer Certifications will be given preference.
About Fictiv (http://www.fictiv.com/)
Our Digital Manufacturing Ecosystem is transforming how the next rockets, self-driving cars, and life-saving robots are designed, developed and delivered to customers around the world.
This transformation is made possible through our technology-backed platform, our global network of manufacturing partners, and our people with deep expertise in hardware and software development.
We’re actively seeking potential teammates who can bring diverse perspectives and experience to our culture and company. We believe inclusion is the best way to create a strong, empathetic team. Our belief is that the best team is born from an environment that emphasizes respect, honesty, collaboration, and growth.
We encourage applications from members of underrepresented groups, including but not limited to women, members of the LGBTQ community, people of color, people with disabilities, and veterans.
Apply for this Job
What’s in it for you?
Opportunity To Unlock Your Creativity
Think of all the times you were held back from trying new ideas because you were boxed in by bureaucratic legacy processes or old-school tactics. A growth mindset has been deeply ingrained in our company culture since day 1, so Fictiv is an environment where you have the creative liberty and the support of the team to try big, bold ideas to achieve our sales and customer goals.
Opportunity To Grow Your Career
There are plenty of sales jobs out there. The question is whether any of them will help you grow in your career. Will you be challenged by teammates to achieve your potential? Or are they roles that will ask you to do more of what you've already mastered? At Fictiv, you'll be surrounded by supportive teammates who will push you to be your best through their curiosity and passion.
Impact in this Role
The Business Applications team performs a critical function for Fictiv by managing software that is part of the framework used to conduct day-to-day business. This team writes code to provide customised application configuration and data structures, customise workflows, implement monitoring and alerting, secure and control access, and integrate business software with Fictiv platform. Functional areas supported include: Operations, Finance, Sales, Marketing, Engineering, Product, Architecture, and Customer Support. The Business Applications team partners closely with cross functional stakeholders to ensure that business systems software is properly secured and has managed change control to meet each of their needs.
This team sets the stage for ensuring Fictiv's business is delivering on KPIs and goals. This team provides inputs for Fictiv’s strategic decision making.
The Business Applications team implements business software across Fictiv’s departments.
As the Sr. Salesforce Analyst (Sr. Salesforce Application Developer, Specialist) you will partner with the RevOps team, to focus on changes and improvements to Salesforce functionality in support of business needs. You will work with the RevOps core team supporting Salesforce Sales and must be a strategic partner in determining best practices and efficiency gains as it relates to process improvements.
You will work with the Lead Salesforce Analyst to design and implement solutions that meet the technical and business requirements outlined by RevOps. You will also analyse project objectives, create customer workflows, and troubleshoot errors. This role requires extensive experience working with Salesforce CRM platforms, application development skills, and the ability to solve complex software problems.
You will be responsible for understanding requirements, defining design, working with other cross-functional teams to create implementation plans, and providing thought leadership for all solutions to meet and exceed RevOps expectations. You will write APEX code and any supporting code required to implement solutions. You will own the stability, security, data accuracy, uptime and issue resolution in Salesforce. You will provide testing plans, unit testing and documentation for all solutions and develop strong cross-functional relations with Product, Engineering and Infrastructure. You will be accountable for following all Fictiv’s Engineering guidelines.
- Experience with QE for distributed, highly scalable systems
- Good understanding of OOPS concepts and strong programming skills in Java, Groovy, or JavaScript
- Hands-on experience with at least one GUI-based test automation tool for desktop and/or mobile automation; experience with multiple tools is an added advantage
- Proficient in writing SQL queries
- Familiarity with the process of test automation tool selection & test approach
- Experience in designing and developing automation frameworks and creating scripts using industry best practices such as the Page Object Model
- Integrate test suites into the test management system and custom test harness
- Familiar with implementation of design patterns, modularization, and user libraries for framework creation
- Can mentor the team and has a short learning curve for new technology
- Understands all aspects of Quality Engineering
- Understanding of SOAP and REST principles
- Thorough understanding of microservices architecture
- In-depth hands-on experience with at least one API testing tool like RestAssured, SOAP UI, or NodeJS
- Hands-on experience working with Postman or a similar tool
- Hands-on experience parsing complex JSON & XML and validating data using serialization techniques like POJO classes or similar
- Hands-on experience performing request and response schema validation, response codes, and exceptions
- Good understanding of BDD and TDD methodologies and tools like Cucumber, TestNG, JUnit, or similar
- Experience in defining API E2E testing strategy and designing and developing API automation frameworks
- Working experience with build tools Maven/Gradle, Git, etc.
- Experience in creating test pipelines – CI/CD
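The request/response schema-validation point above can be sketched as follows. This is a hand-rolled illustration in Python's standard library, not the API of any particular tool, and the schema fields are invented:

```python
import json

# Hedged sketch: check the status code and validate the payload against
# a simple schema (field name -> expected type).
SCHEMA = {"id": int, "status": str, "items": list}

def validate_response(status_code: int, body: str) -> list:
    errors = []
    if status_code != 200:
        errors.append(f"unexpected status code: {status_code}")
    payload = json.loads(body)
    for field, expected_type in SCHEMA.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"wrong type for {field}")
    return errors
```

In practice the same checks are usually delegated to a schema library or to POJO-style deserialization, as the list above notes.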
Position: Sr SDET
Experience: 5 years
Location: Pune (Amar tech park)
Mode: 5 days a week from office
What’s the role?
We are looking for a Senior SDET to contribute to the design and building of our software offerings. Our engineering team works with .NET in an Agile environment, and we use Azure DevOps Pipelines and Release. Our definition of 'DONE' includes writing automated tests, so full regression on releases will be effortless. We strive to do things right, not just band-aid the problems. Our management is engaged and looking for feedback on how we can become better, iteratively.
You will have the opportunity to…
- Participate in story refinement sessions to ensure the details and dependencies are well-defined and understood, with considerations for testing
- Collaborate with the Product Owner and Developers as a team to deliver quality
- Write and maintain test cases, execute them, and perform ad-hoc testing with the end-user experience in mind
- Automate test cases based on priority before the close of the sprint
- Participate in code review to ensure commits are up to standards
- Monitor the Azure Release for regression bugs and or issues with environments
- Work with geo-distributed teams to coordinate testing of features
- Be vocal during Retrospective meetings and follow up on process improvements
- Manage quality and bug reports in all stages of releases
Our ideal candidate will have…
- 5+ years of experience as an SDET
- 3+ years of experience with Selenium WebDriver and Grid
- 2+ years of experience testing web APIs through code
- Strong OOP design experience with C# programming skills
- Ability to write complex SQL queries for data validation
- Knowledge of test methodologies and their corresponding tools
- Ability to recognize errors and assess risks within applications and or processes
- Working knowledge of Visual Studio 2016+ and Git
- 1+ year of experience with CI/CD pipelines
- An understanding of the ROI and risk for ad-hoc testing, test automation, code coverage, and feature coverage
- A passion for design, development, and quality.
Dear Connections,
We are hiring! Join our dynamic team as a QA Automation Tester (Python, Java, Selenium, API, SQL, Git)! We're seeking a passionate professional to contribute to our innovative projects. If you thrive in a collaborative environment, possess expertise in Python, Java, Selenium, and Robot Framework, and are ready to make an impact, apply now! Wissen Technology is committed to fostering innovation, growth, and collaboration. Don't miss this chance to be part of something extraordinary.
Company Overview:
Wissen is the preferred technology partner for executing transformational projects and accelerating implementation through thought leadership and a solution mindset. It is a leading IT solutions and consultancy firm dedicated to providing innovative and customized solutions to global clients. We leverage cutting-edge technologies to empower businesses and drive digital transformation.
Job Description: Fullstack Developer
Are you a passionate developer looking to make a real difference? Do you thrive in a fast-paced startup environment and have a heart for empowering local artisans and small businesses? If so, we have the perfect opportunity for you! Join BharatGo, a dynamic tech startup on a mission to celebrate India's cultural heritage while revolutionizing the way artisans connect with the world.
Role & Responsibilities:
· Technical Leadership: Take charge of the fullstack development process, leading a team of frontend and backend developers. Provide guidance, mentorship, and technical leadership to the development team. Assign tasks, set expectations, and monitor performance.
· Develop high-performance Node.js applications leveraging the Express.js framework
· Analyze and optimize database queries in PostgreSQL and MySQL
· Integrate with AWS services to create a secure and stable backend
· Build pixel-perfect, smooth UIs for mobile applications using ReactJS
· API Integration: Utilize native APIs to facilitate deep integrations with web platforms
· Project Planning: Collaborate with Leaders and cross-functional teams to plan and execute development tasks, ensuring timely delivery of features and updates.
· Code Quality & Reviews: Maintain code quality standards and conduct regular code reviews to ensure the delivery of high-quality, error-free code.
· Performance Optimization: Identify and troubleshoot performance bottlenecks to ensure a seamless and lightning-fast platform experience.
· Cloud Management: Utilize your expertise in AWS cloud services for hosting, managing, and scaling our platform.
· Bug Fixing & Maintenance: Monitor platform performance and proactively address any issues or bugs to keep the platform running flawlessly.
· Continuous Learning: Stay at the forefront of technology trends and propose innovative solutions to enhance our platform's capabilities.
· Team Collaboration: Foster a collaborative work environment, working closely with designers, developers, and stakeholders to achieve project goals.
Requirements:
· Education: Bachelor's or Master's degree in Computer Science or a relevant field.
· Experience: You should have 3 to 5 years of hands-on experience in Fullstack development, with expertise in ReactJS, Node.js, API integration, and AWS Cloud Management.
· Technical Skills: Proficiency in ReactJS, RESTful APIs, Vcode, JavaScript, Android Studio, Node.js, Express JS, PostgreSQL, MySQL, and AWS cloud services.
· Experience with DevOps (CI/CD Pipelines / cloud migrations / logging and monitoring) on AWS
· Good with Git repositories, pull requests, code reviews
· Leadership Abilities: Strong leadership and communication skills to lead and mentor a team of developers effectively.
· Problem-Solving: Proven ability to troubleshoot and resolve complex technical issues.
· Startup Enthusiast: Embrace the fast-paced and dynamic environment of a startup, driven by a passion for making a positive impact.
Role Description
This is a full-time hybrid role. As a GCP Data Engineer, you will be responsible for managing large sets of structured and unstructured data and developing processes to convert data into insights, information, and knowledge.
Skill Name: GCP Data Engineer
Experience: 7-10 years
Notice Period: 0-15 days
Location: Pune
If you have a passion for data engineering and possess the following , we would love to hear from you:
🔹 7 to 10 years of experience working on Software Development Life Cycle (SDLC)
🔹 At least 4+ years of experience with Google Cloud Platform, with a focus on BigQuery
🔹 Proficiency in Java and Python, along with experience in Google Cloud SDK & API Scripting
🔹 Experience in the Finance/Revenue domain would be considered an added advantage
🔹 Familiarity with GCP Migration activities and the DBT Tool would also be beneficial
You will play a crucial role in developing and maintaining our data infrastructure on the Google Cloud platform.
Your expertise in SDLC, BigQuery, Java, Python, and Google Cloud SDK & API scripting will be instrumental in ensuring the smooth operation of our data systems.
Join our dynamic team and contribute to our mission of harnessing the power of data to make informed business decisions.
Leading Product & Service-Based Company
Title: Technical Analyst - OTM
Experience: 3-9 Years
Work Location: Mohali
Shift Timing: Rotational Shift 24x5
Notice Period: Immediate to 30 days Max
Key Skills: OTM, OBIEE, BI Publisher, Oracle ERP
Job Description:
The Oracle Transportation Management Technical Analyst will share the responsibility for design, implementation, and support of business solutions based on Emerson’s instance of Oracle Transportation Management commonly referred to as SCO (Supply Chain Optimization). The Technical Analyst utilizes expertise in Oracle Transportation Management to provide assistance in the ongoing implementation, enhancement, and support of SCO functionality.
Roles and Responsibilities:
- Provide support (e.g., break/fix, how to expertise, enhancements, monitoring, testing, troubleshooting) for the SCO application.
- Works collaboratively with Corporate Logistics and SCO IT Program/Project Managers to understand program requirements and assist with the evaluation of alternative solutions.
- Assist with program rollout activities, including business unit and trading partner on-boarding, project coordination, status reporting and communication to program management.
- Proactively monitors processes to identify trends; analyses/predicts trends and develops a long-range plan designed to resolve problems and prevent them from recurring to ensure high service levels.
- Ensures SCO system documentation is complete and maintained.
- Works effectively in a global highly matrixed team environment.
Skills & Experience Required:
- 4 to 8 years of IT experience, including implementation of Oracle Transportation Management.
- OTM expert, both functionally and technically (setup configuration, order management, shipment management, financials, rates, master data, bulk planning parameters, VPDs, user configuration, screen set development, SQL queries, tracking events, and working with CSV & XML files).
- Hands on experience with triage of day-to-day OTM systems issues and providing resolution on complex issues.
- Knowledge of Logistics management principles and processes.
- Broad knowledge and experience with various ERP systems. Working knowledge of Oracle eBusiness Suite (Procurement, Shipping, XML Gateway) is highly preferred.
- Working knowledge of BI Publisher, FTI/OAC, OBIEE and ETL.
- Good knowledge of EDI and any other Middleware systems.
- Strong customer service orientation with strong written and verbal communication skills, including comfort with presenting to diverse technical and non-technical audiences at all levels of the organization.
- Ability to multi-task and work within diverse multi-disciplinary global project teams.
- Detail-oriented with strong problem-solving skills.
- Comfortable with performing detailed data analysis to identify opportunities and gain higher level insight.
- Knowledge on GTM (Global Trade Management) will be a plus.
- Some travel might be required.
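As a toy illustration of the CSV & XML handling the role calls for, the sketch below converts CSV shipment rows into an XML document using Python's standard library; the element and field names are invented, not real OTM message formats:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical converter: CSV shipment rows in, simple XML document out.
def shipments_csv_to_xml(raw_csv: str) -> str:
    root = ET.Element("Shipments")
    for row in csv.DictReader(io.StringIO(raw_csv)):
        shipment = ET.SubElement(root, "Shipment", id=row["shipment_id"])
        ET.SubElement(shipment, "Origin").text = row["origin"]
        ET.SubElement(shipment, "Destination").text = row["destination"]
    return ET.tostring(root, encoding="unicode")
```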
Education
- Bachelor’s degree in computer science, Information Systems, or another related field.
To be successful in this role, you should possess
- Overall industry experience of 2-6 years
- Bachelor’s degree in an analytical subject area, e.g., Engineering, Statistics, etc.
- Proficient in advanced Excel functions and macros, involving complex calculations and pivots
- Exceptional analytical, problem-solving & logical skills
- Understanding of relational database concepts and familiarity with SQL
- Demonstrable aptitude for innovation & problem solving
- Good communication skills & ability to work across cross-functional teams
- Understands complex utility tariffs, rates and programs and converts these into a model.
- Participates in sprint planning & other ceremonies, passionately works towards fulfilling the committed sprint goals.
- Knowledge of and ability to automate routine tasks using Python is a PLUS
Preferred Qualifications:
- Experience in Energy Industry & familiar with basic concepts of Utility (electrical/gas...) tariffs
- Experience & knowledge with tools like Microsoft Excel macros
- Familiarity with writing programs using Python or Shell scripts.
- Passionate about working with data and data analyses.
- 1+ year experience in Agile methodology.
Roles and responsibilities
- Understands complex utility tariffs, rates and programs and converts these into a model.
- Responsible for analyzing energy utilization data for cost and usage patterns and deriving meaningful insights.
- Responsible for maintaining the tariff models with timely updates for price, logic, or other enhancements per client requirements.
- Assist the delivery team in validating the input data received from the client for modelling work.
- Responsible for communicating & coordinating with the delivery team.
- Work with Cross functional teams to resolve issues in the Modelling tool.
- Build and deliver compelling demonstrations/visualizations of products
- Be a lifelong learner and develop your skills continuously
- Contribute to the success of a rapidly growing and evolving organization
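As a simplified illustration of converting a tariff into a model, the sketch below prices usage against an invented tiered rate schedule; real utility tariffs add time-of-use periods, demand charges, riders, and more:

```python
# Hedged sketch of a tiered utility tariff model with invented rates.
TIERS = [                  # (upper bound in kWh, rate per kWh)
    (100, 0.10),           # first 100 kWh
    (300, 0.15),           # next 200 kWh
    (float("inf"), 0.20),  # everything above 300 kWh
]

def bill(usage_kwh: float) -> float:
    total, lower = 0.0, 0.0
    for upper, rate in TIERS:
        if usage_kwh <= lower:
            break
        # Charge only the slice of usage that falls inside this tier.
        total += (min(usage_kwh, upper) - lower) * rate
        lower = upper
    return round(total, 2)
```

Maintaining such a model, as the responsibilities above describe, amounts to keeping the rate table and tier logic in sync with each tariff revision.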
Additional Project/Soft Skills:
- Should be able to work independently with India & US based team members.
- Strong verbal and written communication with ability to articulate problems and solutions over phone and emails.
- Strong sense of urgency, with a passion for accuracy and timeliness.
- Ability to work calmly in high pressure situations and manage multiple projects/tasks.
- Ability to work independently and possess superior skills in issue resolution.
Title/Role: Python Django Consultant
Experience: 8+ Years
Work Location: Indore / Pune /Chennai / Vadodara
Notice period: Immediate to 15 Days Max
Key Skills: Python, Django, Crispy Forms, Authentication, Bootstrap, jQuery, Server Side Rendered, SQL, Azure, React, Django DevOps
Job Description:
- Should have knowledge of and experience creating forms in Django; Crispy Forms is a plus.
- Must have leadership experience
- Should have a good understanding of function-based and class-based views.
- Should have a good understanding of authentication (JWT and token authentication)
- Django – at least one senior developer with deep Django experience; the other one or two can be mid-to-senior Python or Django developers
- Frontend – must have React/Angular and CSS experience
- Database – ideally SQL, but the most senior member should have solid DB experience
- Cloud – Azure preferred but agnostic
- Consulting / client project background ideal.
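The token-authentication requirement above can be illustrated with a simplified, stdlib-only sketch of the signed-token idea behind JWT. A real Django project would use a library such as djangorestframework-simplejwt; the secret and payload here are made up:

```python
# Simplified illustration of JWT-style signed tokens: a base64 payload plus
# an HMAC signature. Not production code; SECRET and the payload are made up.
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # hypothetical signing key

def issue_token(payload: dict) -> str:
    body = base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def verify_token(token: str):
    body, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    # constant-time comparison avoids timing side channels
    if not hmac.compare_digest(sig, expected):
        return None  # signature mismatch: reject the token
    return json.loads(base64.urlsafe_b64decode(body))

token = issue_token({"user": "alice"})
print(verify_token(token))        # {'user': 'alice'}
print(verify_token(token + "x"))  # None (tampered signature)
```

The same verify-on-every-request pattern is what JWT middleware does for a Django REST endpoint, with expiry claims and key rotation added on top.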
Django Stack:
- Django
- Server Side Rendered HTML
- Bootstrap
- jQuery
- Azure SQL
- Azure Active Directory
- Server-side rendering with jQuery is older tech, but it is what we are OK with for internal tools. This is a good combination of a late-adopter agile stack integrated within an enterprise. Potentially we can push them to React for some discrete projects or pages that need more dynamism.
Django Devops:
- Should have expertise with deploying and managing Django in Azure.
- Django deployment to Azure via Docker.
- Django connection to Azure SQL.
- Django auth integration with Active Directory.
- Terraform scripts to make this setup seamless.
- Easy, proven deployment/setup to AWS and GCP.
- Load balancing, more advanced services, task queues, etc.
An e-commerce accelerator platform that builds for brands' sales
Position = Java Developer
We are looking to hire a committed Java Developer with experience in building high-performing, scalable, enterprise-grade applications. You will be part of our Engineering team that works on mission-critical applications. You will be managing Java/Java EE application development while providing expertise in the full software development lifecycle, from concept and design to testing.
You are required to:
Contribute to all phases of the development lifecycle.
Write well-designed, testable, and efficient code.
Ensure designs are in compliance with specifications.
Prepare and produce releases of software components.
Support continuous improvement by investigating alternatives and technologies and presenting these for architectural review.
Technical Skills required
Java, Spring Boot, Microservices, Data Structures & Algorithms, MySQL, NoSQL, MongoDB, and Hibernate.
OUR CURRENT STACK
Backend: Spring (JAVA), Laravel (PHP), MySQL, NoSQL, NGINX Plus.
Frontend: Angular 5+ Ngrx/store 5
Infrastructure: Google cloud platform (App engine, CloudSQL, BigQuery, PubSub, Firebase Hosting), Scrapy Cloud, Pusher.io (websockets), Getstream.io, Filestack, Postmark app, AS2 Gateway, Google Cloud Endpoints Framework, MongoDB, Algolia, Memcache
Tools: GitLab, Postman, JIRA.
You are where our search ends, if you hold:
B.Tech/M.Tech or a corresponding degree
1-6 years of experience in a similar role
Experience with connecting backend and frontend services.
Exposure to consuming data through different interfaces (Web APIs, sockets, REST/RESTful, JSON, XML).
Proficiency in Data Structures and Algorithms.
Understanding of web markup, including HTML5 and CSS.
Understanding of client-side scripting and JavaScript frameworks.
Ability to write clean, reusable and well documented code.
Proficient understanding of code versioning tools, such as Git.
Knowledge of API authentication techniques (Token, JWT, OAuth2) - desirable but not mandatory. (Experience with API Design will be a plus)
Fair spoken and written English.
Flexibility – things change around here, fast!
Other interpersonal skills such as self-motivation, persistence, patience, and eagerness to learn and work independently.
Full Stack Developer Job Description
Position: Full Stack Developer
Department: Technology/Engineering
Location: Pune
Type: Full Time
Job Overview:
As a Full Stack Developer at Invvy Consultancy & IT Solutions, you will be responsible for both front-end and back-end development, playing a crucial role in designing and implementing user-centric web applications. You will collaborate with cross-functional teams including designers, product managers, and other developers to create seamless, intuitive, and high-performance digital solutions.
Responsibilities:
Front-End Development:
Develop visually appealing and user-friendly front-end interfaces using modern web technologies such as C#, HTML5, CSS3, and JavaScript frameworks (e.g., React, Angular, Vue.js).
Collaborate with UX/UI designers to ensure the best user experience and responsive design across various devices and platforms.
Implement interactive features, animations, and dynamic content to enhance user engagement.
Optimize application performance for speed and scalability.
Back-End Development:
Design, develop, and maintain the back-end architecture using server-side technologies (e.g., Node.js, Python, Ruby on Rails, Java, .NET).
Create and manage databases, including data modeling, querying, and optimization.
Implement APIs and web services to facilitate seamless communication between front-end and back-end systems.
Ensure security and data protection by implementing proper authentication, authorization, and encryption measures.
Collaborate with DevOps teams to deploy and manage applications in cloud environments (e.g., AWS, Azure, Google Cloud).
Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
Proven experience as a Full Stack Developer or similar role.
Proficiency in front-end development technologies like HTML5, CSS3, JavaScript, and popular frameworks (React, Angular, Vue.js, etc.).
Strong experience with back-end programming languages and frameworks (Node.js, Python, Ruby on Rails, Java, .NET, etc.).
Familiarity with database systems (SQL and NoSQL) and their integration with web applications.
Knowledge of web security best practices and application performance optimization.
at DeepIntent
Who We Are:
DeepIntent is leading the healthcare advertising industry with data-driven solutions built for the future. From day one, our mission has been to improve patient outcomes through the artful use of advertising, data science, and real-world clinical data.
What You’ll Do:
We are looking for a Senior Software Engineer based in Pune, India who can master both DeepIntent’s data architectures and pharma research and analytics methodologies to make significant contributions to how health media is analyzed by our clients. This role requires an Engineer who not only understands DBA functions but also how they impact research objectives and can work with researchers and data scientists to achieve impactful results.
This role will be in the Analytics Organization and will require integration and partnership with the Engineering Organization. The ideal candidate is an inquisitive self-starter who is not afraid to take on and learn from challenges and will constantly seek to improve the facets of the business they manage. The ideal candidate will also need to demonstrate the ability to collaborate and partner with others.
- Serve as the Engineering interface between Analytics and Engineering teams
- Develop and standardize all interface points for analysts to retrieve and analyze data, with a focus on research methodologies and data-based decisioning
- Optimize queries and data access efficiencies, serve as expert in how to most efficiently attain desired data points
- Build “mastered” versions of the data for Analytics specific querying use cases
- Help with data ETL, table performance optimization
- Establish formal data practice for the Analytics practice in conjunction with rest of DeepIntent
- Build & operate scalable and robust data architectures
- Interpret analytics methodology requirements and apply to data architecture to create standardized queries and operations for use by analytics teams
- Implement DataOps practices
- Master existing and new Data Pipelines and develop appropriate queries to meet analytics specific objectives
- Collaborate with various business stakeholders, software engineers, machine learning engineers, analysts
- Operate between Engineers and Analysts to unify both practices for analytics insight creation
Who You Are:
- Adept in market research methodologies and using data to deliver representative insights
- Inquisitive, curious, understands how to query complicated data sets, move and combine data between databases
- Deep SQL experience is a must
- Exceptional communication skills, with the ability to collaborate and translate between technical and non-technical needs
- English Language Fluency and proven success working with teams in the U.S.
- Experience in designing, developing and operating configurable Data pipelines serving high volume and velocity data
- Experience working with public clouds like GCP/AWS
- Good understanding of software engineering, DataOps, and data architecture, Agile and DevOps methodologies
- Experience building Data architectures that optimize performance and cost, whether the components are prepackaged or homegrown
- Proficient with SQL, Python or a JVM-based language, and Bash
- Experience with any of the Apache open-source projects such as Spark, Druid, Beam, Airflow, etc., and big data databases like BigQuery, ClickHouse, etc.
- Ability to think big, take bets and innovate, dive deep, hire and develop the best talent, learn and be curious
- Comfortable working in the EST time zone
Proven work experience as a Database Programmer, Database Developer, or similar role.
Strong proficiency in SQL and hands-on experience with PostgreSQL.
Solid understanding of relational database concepts and principles.
Experience in database design, schema modeling, and optimization.
Familiarity with database administration tasks, such as user management, backup and recovery, and performance tuning.
Ability to write efficient SQL queries, stored procedures, and functions.
Detail-oriented with a focus on data accuracy and integrity.
Familiarity with software development methodologies and programming languages is a plus.
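The query-writing and optimization skills above can be sketched with a small, self-contained example. The role targets PostgreSQL, but sqlite3 is used here only so the sketch runs without a server; the indexing and parameterization principles carry over, and the table and data are made up:

```python
# Self-contained illustration of an indexed, parameterized query.
# Table name, columns, and rows are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("alice", 120.0), ("bob", 80.0), ("alice", 45.5)],
)
# An index on the filtered column lets the planner avoid a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

# Parameterized query: safe against SQL injection and plan-cache friendly
row = conn.execute(
    "SELECT customer, SUM(total) FROM orders WHERE customer = ? GROUP BY customer",
    ("alice",),
).fetchone()
print(row)  # ('alice', 165.5)
```

In PostgreSQL the same query would be profiled with `EXPLAIN ANALYZE` to confirm the index is actually used.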
at AxionConnect Infosolutions Pvt Ltd
Job Location: Hyderabad/Bangalore/ Chennai/Pune/Nagpur
Notice period: Immediate - 15 days
1. Python Developer with Snowflake
Job Description :
- 5.5+ years of strong Python development experience with Snowflake.
- Strong hands-on experience with SQL and the ability to write complex queries.
- Strong understanding of how to connect to Snowflake using Python; should be able to handle any type of file.
- Development of data analysis and data processing engines using Python.
- Good experience in data transformation using Python.
- Experience in Snowflake data load using Python.
- Experience in creating user-defined functions in Snowflake.
- SnowSQL implementation.
- Knowledge of query performance tuning will be added advantage.
- Good understanding of Datawarehouse (DWH) concepts.
- Interpret/analyze business requirements and functional specifications
- Good to have DBT, FiveTran, and AWS Knowledge.
Job Description: Core Java Developer
Company: Mobile Programming LLC
Location: Pune (Remote work available)
Salary: Up to 16 LPA
Position: Core Java Developer
Responsibilities:
- Design, develop, and maintain Java-based applications using Core Java and Spring Boot frameworks.
- Collaborate with cross-functional teams to deliver high-quality software solutions.
- Conduct code reviews and troubleshoot/debug complex issues.
- Optimize application performance and stay updated with industry trends.
Requirements:
- Minimum 6 years of hands-on Core Java development experience.
- Strong proficiency in Spring Boot framework.
- Solid understanding of OOP principles, web development (HTML/CSS/JavaScript), RESTful web services, and SQL.
- Experience with Git and problem-solving skills.
- Excellent communication skills and ability to work independently and as part of a team.
- Bachelor's or Master's degree in Computer Science or a related field.
Note: Immediate joiners required. Please include your resume and relevant code samples/projects when applying.
L2 Support
Location : Mumbai, Pune, Bangalore
Requirement details : (Mandatory Skills)
- Excellent communication skills
- Production Support, Incident Management
- SQL (must have experience in writing complex queries)
- Unix (must have working experience on the Linux operating system)
- Perl/Shell scripting
- Candidates working in the Investment Banking domain will be preferred
Title: Senior Business Analyst
Job Location: Hyderabad, Chennai, Cochin, Bangalore, Delhi, Kolkata, Pune
Experience: 9+ years
Important Note: We are hiring for a Techno-Functional Business Analyst with a background as a Developer (preferably Java or .NET). We are looking for a candidate who has expertise in providing business solutions.
Business Analysts conduct market analyses, analyzing both product lines and the overall profitability of the business. In addition, they develop and monitor data quality metrics and ensure business data and reporting needs are met. Strong technology, analytical, and communication skills are must-have traits.
Essential Responsibilities:
- Evaluating business processes, anticipating requirements, uncovering areas for improvement, and developing and implementing solutions.
- Leading ongoing reviews of business processes and developing optimization strategies.
- Staying up to date on the latest process and IT advancements to automate and modernize systems.
- Conducting meetings and presentations to share ideas and findings.
- Understand business requirements and document and translate them into features/user stories.
- Ensure the system design meets the needs of the customer. Participate in functionality testing and user acceptance testing of new features.
- Develop business artifacts related to the client's business and conduct formal training sessions for the team.
- Act as a coach on assigned projects and assignments, providing business-related direction and clarification to developers and other project stakeholders.
- Develop a team culture where everyone thinks from end user perspective.
- Performing requirements analysis.
- Documenting and communicating the results of your efforts.
- Effectively communicating your insights and plans to cross-functional team members and management.
- Gathering critical information from meetings with various stakeholders and producing useful reports.
- Working closely with clients, technicians, and managerial staff.
- Providing leadership, training, coaching, and guidance to junior staff.
- Allocating resources and maintaining cost efficiency.
- Ensuring solutions meet business needs and requirements.
- Performing user acceptance testing.
- Managing projects, developing project plans, and monitoring performance.
- Updating, implementing, and maintaining procedures.
- Prioritizing initiatives based on business needs and requirements.
- Serving as a liaison between stakeholders and users.
- Managing competing resources and priorities.
- Monitoring deliverables and ensuring timely completion of projects.
Requirements:
- A bachelor’s degree in business or related field or an MBA.
- A minimum of 5 years of experience in business analysis or a related field.
- Should come from a development background.
- Should have business solution expertise.
- Exceptional analytical and conceptual thinking skills.
- Insurance domain experience is a must.
- The ability to influence stakeholders and work closely with them to determine acceptable solutions.
- Advanced technical skills.
- Excellent documentation skills.
- Fundamental analytical and conceptual thinking skills.
- Experience creating detailed reports and giving presentations.
Responsibilities:
• Designing Hive/HCatalog data models, including table definitions, file formats, and compression techniques for structured and semi-structured data processing
• Implementing Spark-based ETL processing frameworks
• Implementing Big data pipeline for Data Ingestion, Storage, Processing & Consumption
• Modifying the Informatica-Teradata & Unix based data pipeline
• Enhancing the Talend-Hive/Spark & Unix based data pipelines
• Developing and deploying Scala/Python-based Spark jobs for ETL processing
• Applying strong SQL and DWH concepts
Preferred Background:
• Function as integrator between business needs and technology solutions, helping to create technology solutions to meet clients’ business needs
• Lead project efforts in defining scope, planning, executing, and reporting to stakeholders on strategic initiatives
• Understanding the business's EDW system and creating high-level design and low-level implementation documents
• Understanding the business's Big Data Lake system and creating high-level design and low-level implementation documents
• Designing Big data pipeline for Data Ingestion, Storage, Processing & Consumption
Requirements:
· UiPath certification
· Proficient in the UiPath Platform, Test Manager, and Test Suite, with 6+ years of experience in test automation
· Hands-on experience in building automated scripts using a low-code/no-code platform (UiPath)
· Experience on Testing SOAP or REST API
· Experience building data-driven tests and frameworks for Web, Windows, and microservices.
· Understanding of test methodologies (regression, functional, unit, integration, code coverage, performance, etc.)
· Designing and developing test automation frameworks and understanding of test automation design patterns and software testing principles.
· Familiarity with Relational Databases and SQL
· Bachelor's degree in computer science, engineering or related field
· Minimum of 7 years of experience in software testing and test automation
· Minimum of 5 years of experience in UIPath test automation
· Strong knowledge of test automation frameworks and tools
· Experience with continuous integration and continuous delivery (CI/CD) pipelines
· Ability to analyze and debug complex issues
· Excellent problem-solving skills
· Strong communication skills and ability to work collaboratively in a team environment
· Knowledge of agile methodologies
Flexibility
Need to be flexible with respect to working times, providing a two-hour overlap with IST and UK time on an ongoing basis.
If you are passionate about test automation and have experience with UIPath, we encourage you to apply for this exciting opportunity. We offer a competitive salary, excellent benefits, and opportunities for growth and development.
A global digital transformation consulting company
Automation Test Engineer
(UI Automation, API+UI Automation, UI+API+Mobile Automation)
Experience (4- 9) Yrs
Notice Period – Immediate to 25 days
Location- Pan India
Skill Set-
Must Haves :
• Experience with QE for distributed, highly scalable systems
• Good understanding of OOPS concepts and strong programming skills in Java, Groovy or JavaScript
• Hands on experience of working with at least one of GUI based test automation tools for desktop and/or mobile automation. Experience on multiple tools will be added advantage
• Proficient in writing SQL queries
• Familiarity with process of test automation tool selection & test approach
• Experience in designing and development of automation framework and creation of scripts using best industry practices such as Page object model
• Integrate test suites into the test management system and custom test harness
• Familiar with implementation of design patterns, modularization, and user libraries for framework creation
• Can mentor the team and has a short learning curve for new technology
• Understands all aspects of Quality Engineering
• Understanding of SOAP and REST principles
• Thorough understanding of microservices architecture
• In-depth hands-on experience of working with at least one API testing tool like RestAssured, SOAP UI, NodeJS
• Hands-on experience working with Postman or similar tool
• Hands-on experience in parsing complex JSON & XML and data validation using serialization techniques like POJO classes or similar
• Hands-on experience in performing Request and Response Schema validation, Response codes and exceptions
• Good Understanding of BDD, TDD methodologies and tools like Cucumber, TestNG, Junit or similar.
• Experience in defining API E2E testing strategy, designing and development of API automation framework
• Working experience on build tools Maven / Gradle, Git etc.
• Experience in creating test pipelines – CI/CD
Preferred (mostly for people being hired at the Senior Associate career stage):
• Possess domain knowledge to identify issues across those domains, understand their impact, and drive resolution (familiar/expert in domains like retail banking, automobile, insurance, betting, food markets, hotel industry, healthcare)
• Used / exposed to automation tools for automating mobile applications
• Expertise in creating test automation frameworks and implementing and maintaining them on a project
• Experience in modern agile practices such as BDD/Cucumber and DevOps
• Knowledge and experience in service virtualization and tools like CA Lisa
• Hands-on knowledge of setting up PACT Broker and writing PACT tests is desirable
• Experience in test management tools like Xray & Zephyr and integration of test framework with these tools
• Understanding of commonly used software design patterns like Builder, Factory, Singleton, and Façade.
Test Management: Must Haves
• Able to estimate for low- and medium-complexity applications and have used at least one estimation technique.
• Able to handle/oversee a small team of 2-5 people and guide them through the complete SDLC, from test case creation to test closure activities
• Well-versed in most of the activities in the defect management process; can define/enhance the defect documentation and TAR lifecycle process independently
• Have the expertise to enforce and adhere to defect and other processes in the team
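The JSON parsing and serialization-based validation skills listed above can be sketched in Python, the analogue of deserializing a response into POJO classes. The payload and field names are illustrative:

```python
# Illustrative sketch of deserializing an API response into a typed object
# and validating its schema. Field names and the payload are made up.
import json
from dataclasses import dataclass

@dataclass
class User:
    id: int
    name: str
    active: bool

def parse_user(raw: str) -> User:
    data = json.loads(raw)
    # Fail fast if the response does not match the expected schema
    missing = {"id", "name", "active"} - data.keys()
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return User(id=int(data["id"]), name=str(data["name"]), active=bool(data["active"]))

user = parse_user('{"id": 7, "name": "asha", "active": true}')
print(user)  # User(id=7, name='asha', active=True)
```

In a framework like RestAssured the same idea appears as deserializing the response body into a POJO and asserting on its fields, alongside checks of the response code.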
1. 7 to 12-14 years of experience as a Software Developer.
2. Experience in .NET technologies – C#, ASP.NET – is a must.
3. Experience in developing REST based APIs.
4. Experience in building web user interface.
5. Good to have experience in developing single page apps using modern JS framework.
6. Skills in cross browser development.
7. Understanding of Agile methodology.
8. Excellent troubleshooting and communication skills.
9. Knowledge of Azure will be an added advantage.
JOB DESCRIPTION -
- Developing and implementing web-based applications using .Net technology and MS-SQL.
- Developing highly responsive front-end user interfaces using React concepts.
- Identifying web-based user interaction.
- Participate in requirement analysis.
- Collaborate with internal teams with a positive mindset to help and seek help.
- Documentation of application development changes and updates.
TECHNICAL SKILLS
- Expert in C#, ASP.Net, Web API, MS SQL Server.
- Good to have experience in React and workflows such as Redux or Flux, MongoDB.
- Knowledge of HTML, CSS and JavaScript.
- Familiar with source control system (GIT, SVN).
ESSENTIAL EXPERIENCE
1. 3.5 to 12-14 years of experience as a Software Developer.
2. Experience in .NET technologies – C#, ASP.NET – is a must.
3. Experience in developing REST-based APIs.
4. Experience in building web user interface.
5. Good to have experience in developing single page apps using modern JS framework.
6. Skills in cross browser development.
7. Understanding of Agile methodology.
8. Excellent troubleshooting and communication skills.
9. Knowledge of Azure will be an added advantage.
JOB DESCRIPTION
- Developing and implementing web-based applications using .Net technology and MS-SQL.
- Developing highly responsive front-end user interfaces using React concepts.
- Identifying web-based user interaction.
- Participate in requirement analysis.
- Collaborate with internal teams with a positive mindset to help and seek help.
- Documentation of application development changes and updates.
About Apexon:
Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. For over 17 years, Apexon has been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving our clients’ toughest technology problems, and a commitment to continuous improvement. We focus on three broad areas of digital services: User Experience (UI/UX, Commerce); Engineering (QE/Automation, Cloud, Product/Platform); and Data (Foundation, Analytics, and AI/ML), and have deep expertise in BFSI, healthcare, and life sciences.
Apexon is backed by Goldman Sachs Asset Management and Everstone Capital.
To know more about us please visit: https://www.apexon.com/
Responsibilities:
- We are looking for a C# automation engineer with 4-6 years of experience to join our engineering team and help us develop and maintain various software/utility products.
- Good object-oriented programming concepts and practical knowledge.
- Strong programming skills in C# are required.
- Good knowledge of C# Automation is preferred.
- Good to have experience with the Robot framework.
- Must have knowledge of API (REST APIs), and database (SQL) with the ability to write efficient queries.
- Good to have knowledge of Azure cloud.
- Take end-to-end ownership of test automation development, execution and delivery.
Good to have:
- Experience in tools like SharePoint, Azure DevOps.
Other skills:
- Strong analytical & logical thinking skills. Ability to think and act rationally when faced with challenges.
About the role:
- Experience: 7+ years of experience building modern web applications, working across the full stack
- Proficiency in TypeScript and JavaScript with a thorough understanding of React.js or Vue.js and their core principles preferred
- Experience implementing RESTful APIs using .NET/C# preferred, with .NET Core or .NET 5/6 experience a bonus!
- Experience with SQL and relational database design; MS SQL Server experience is an added advantage
- Experience with NoSQL/document database technologies
- Experience writing automated unit tests in the full stack environment
- Knowledge of modern authentication and authorization mechanisms
- Familiarity with modern front-end and backend build pipelines and tools
- Experience with common front-end development tools such as Babel, Webpack, NPM, etc.
- Experience with modern responsive web application design & development
- Familiarity with Node.js
- Experience with microservice architecture
- Experience using Git version control
- Experience with VS Code, Visual Studio, or other relevant development tools
- Familiarity with Scrum/Agile principles
- Strong communication skills
- Ability to understand business requirements and translate them into technical requirements.
Required skill set:
- Candidate must be good in JavaScript and have experience in at least one modern JavaScript framework such as Vue/Angular/React. But must be willing to work in Vue/React.
- Must have experience in .NET Framework. Good to have experience in .NET Core/.NET 5/6/7. But must be willing and capable enough to learn .NET Core.
- Should be able to work independently with minimum supervision.
- Must be good with programming concepts such as OOP, unit tests, Web API, SQL, etc.
Software QA Automation Engineer
Experience: 1+ year
Location: Pune, India
FlytBase is looking for passionate and hardworking Software QA Automation Engineers to join our rapidly growing team. As a Software QA Automation Engineer, you will be responsible for the planning and implementation of tests that prove the functional and non-functional requirements of the system.
About FlytBase
FlytBase is a deep-tech startup that provides hardware-agnostic software solutions to automate and scale drone operations. It is the world’s first Internet of Drones Platform (IoD) that allows seamless & cloud-connected deployment of intelligent drone fleets for a variety of business applications. The team comprises young engineers and designers from top-tier universities such as IIT-B, IIT-KGP, University of Maryland & Georgia Tech, and with deep expertise in drone technology, computer science, electronics, and robotics.
The company is headquartered in Silicon Valley, California, USA, and has R&D offices in Pune, India. Widely recognized as a pioneer in the commercial drone ecosystem, FlytBase won the Global NTT Data Innovation Contest in Tokyo, Japan. FlytBase was also awarded the TiE50 Top Startup award by TiE Silicon Valley.
Role and Responsibilities:
- Designing and developing test automation scripts for a web application (Angular) using test automation guidelines.
- Proficient in end-to-end testing for both Web and mobile applications.
- Build tools and frameworks to aid continuous delivery, deployment, and debugging.
- Supporting the development team and software engineers during the development and testing phase.
- Prepare defect reports and report test progress.
Experience/Skills:
- Experience in Automation Testing with Cypress, Selenium, or similar end-to-end testing tools.
- Experience with Regression, Smoke testing, API, and Backend testing.
- Experience with non-functional testing, performance, and vulnerability testing.
- Excellent knowledge of testing skills (design test plan and test strategy, writing and executing test cases, opening bugs, verifying bugs, etc).
- A team player, and fast learner with good interpersonal, verbal, and written communication skills.
Compensation:
This role comes with an annual CTC that is market competitive and depends on the quality of your work experience, degree of professionalism, culture fit, and alignment with the company’s long-term business strategy.
Perks:
- Fast-paced Startup culture
- Hacker mode environment
- Enthusiastic and approachable team
- Professional autonomy
- Company-wide sense of purpose
- Flexible work hours
- Informal dress code
Your key responsibilities
- Create and maintain optimal data pipeline architecture. Should have experience in building batch/real-time ETL Data Pipelines. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- The individual will be responsible for solution design, integration, data sourcing, transformation, database design and implementation of complex data warehousing solutions.
- Responsible for development, support, maintenance, and implementation of a complex project module
- Provide expertise in area and advanced knowledge of applications programming and ensure application design adheres to the overall architecture blueprint
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
- Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support, and deliver complete reporting solutions.
- Prepare a high-level design (HLD) document describing the application architecture.
- Prepare a low-level design (LLD) document covering job design, job descriptions, and detailed job information.
- Prepare and execute unit test cases.
- Provide technical guidance and mentoring to application development teams throughout all the phases of the software development life cycle
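For illustration only, the batch extract-transform-load flow described in the responsibilities above can be sketched as follows. This is a minimal, hypothetical example; the table names (`raw_sales`, `clean_sales`) and the cleaning rules are invented for the sketch, not taken from the role.

```python
import sqlite3

# Minimal batch ETL sketch (extract -> transform -> load).
# All table/column names here are hypothetical.
def extract(conn):
    return conn.execute("SELECT customer, amount FROM raw_sales").fetchall()

def transform(rows):
    # Normalize customer names and drop non-positive amounts.
    return [(c.strip().lower(), a) for c, a in rows if a > 0]

def load(conn, rows):
    conn.executemany("INSERT INTO clean_sales VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (customer TEXT, amount REAL)")
conn.execute("CREATE TABLE clean_sales (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)",
                 [(" Acme ", 100.0), ("beta", -5.0)])

load(conn, transform(extract(conn)))
print(conn.execute("SELECT * FROM clean_sales").fetchall())  # [('acme', 100.0)]
```

In a real pipeline each stage would typically be an orchestrated job (e.g. an Azure Data Factory activity) rather than an in-process function, but the extract/transform/load separation is the same.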
Skills and attributes for success
- Strong experience in SQL: proficient in writing performant SQL against large data volumes, and in writing and debugging complex queries.
- Strong experience with Microsoft Azure database systems; experienced in Azure Data Factory.
- Strong in Data Warehousing concepts. Experience with large-scale data warehousing architecture and data modelling.
- Should have solid experience with PowerShell scripting
- Able to guide the team through the development, testing and implementation stages and review the completed work effectively
- Able to make quick decisions and solve technical problems to provide an efficient environment for project implementation
- Primary owner of delivery and timelines; review code written by other engineers.
- Maintain the highest levels of development practice, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, and self-sustaining code, with repeatable quality and predictability
- Must have an understanding of business intelligence development in the IT industry
- Outstanding written and verbal communication skills
- Should be adept in SDLC process - requirement analysis, time estimation, design, development, testing and maintenance
- Hands-on experience in installing, configuring, operating, and monitoring CI/CD pipeline tools
- Should be able to orchestrate and automate pipelines
- Good to have: knowledge of distributed systems such as Hadoop, Hive, and Spark
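As a hedged illustration of "performant SQL over large data volumes" in the skills above: preferring a single set-based, indexed aggregate over row-by-row processing. The schema (`readings`, `meter_id`, `kwh`) is invented for this sketch and is not part of the posting; SQLite stands in for the production database.

```python
import sqlite3

# Hypothetical example: summarizing a large meter-readings table with one
# set-based query instead of per-row application logic.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (meter_id TEXT, day TEXT, kwh REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [("m1", "2024-01-01", 10.0), ("m1", "2024-01-02", 12.0),
     ("m2", "2024-01-01", 3.0)],
)
# An index on the grouping key helps this query scale to large volumes.
conn.execute("CREATE INDEX idx_readings_meter ON readings (meter_id)")

rows = conn.execute(
    """
    SELECT meter_id, SUM(kwh) AS total_kwh
    FROM readings
    GROUP BY meter_id
    HAVING SUM(kwh) > 5
    ORDER BY meter_id
    """
).fetchall()
print(rows)  # [('m1', 22.0)]
```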
To qualify for the role, you must have
- Bachelor's Degree in Computer Science, Economics, Engineering, IT, Mathematics, or related field preferred
- More than 6 years of experience in ETL development projects
- Proven experience in delivering effective technical ETL strategies
- Microsoft Azure project experience
- Technologies: ETL- ADF, SQL, Azure components (must-have), Python (nice to have)
Ideally, you’ll also have
Graas uses predictive AI to turbo-charge growth for eCommerce businesses. We are “Growth-as-a-Service”. Graas is a technology solution provider using predictive AI to turbo-charge growth for eCommerce businesses. Graas integrates traditional data silos and applies a machine-learning AI engine, acting as an in-house data scientist to predict trends and give real-time insights and actionable recommendations for brands. The platform can also turn insights into action by seamlessly executing these recommendations across marketplace store fronts, brand.coms, social and conversational commerce, performance marketing, inventory management, warehousing, and last mile logistics - all of which impacts a brand’s bottom line, driving profitable growth.
Roles & Responsibilities:
- Work on the implementation of real-time and batch data pipelines for disparate data sources.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies.
- Build and maintain an analytics layer that utilizes the underlying data to generate dashboards and provide actionable insights.
- Identify improvement areas in the current data system and implement optimizations.
- Work on specific areas of data governance including metadata management and data quality management.
- Participate in discussions with Product Management and Business stakeholders to understand functional requirements and interact with other cross-functional teams as needed to develop, test, and release features.
- Develop Proof-of-Concepts to validate new technology solutions or advancements.
- Work in an Agile Scrum team and help with planning, scoping and creation of technical solutions for the new product capabilities, through to continuous delivery to production.
- Work on building intelligent systems using various AI/ML algorithms.
Desired Experience/Skill:
- Must have worked on Analytics Applications involving Data Lakes, Data Warehouses and Reporting Implementations.
- Experience with private and public cloud architectures with pros/cons.
- Ability to write robust code in Python and SQL for data processing. Experience in libraries such as Pandas is a must; knowledge of one of the frameworks such as Django or Flask is a plus.
- Experience in implementing data processing pipelines using AWS services: Kinesis, Lambda, Redshift/Snowflake, RDS.
- Knowledge of Kafka and Redis is preferred
- Experience in the design and implementation of real-time and batch pipelines. Knowledge of Airflow is preferred.
- Familiarity with machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn)
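As a small, purely illustrative sketch of the Pandas-style data processing listed above (the `orders` table and its columns are invented for the example, not drawn from the role):

```python
import pandas as pd

# Hypothetical eCommerce orders; column names are invented for illustration.
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "channel": ["marketplace", "brand.com", "marketplace", "social"],
    "revenue": [120.0, 80.0, 200.0, 50.0],
})

# Aggregate revenue per sales channel, largest first -- the kind of
# transformation an analytics layer feeds into dashboards.
by_channel = (orders.groupby("channel")["revenue"]
              .sum()
              .sort_values(ascending=False))
print(by_channel.to_dict())  # {'marketplace': 320.0, 'brand.com': 80.0, 'social': 50.0}
```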
As an engineer, you will help with the implementation and launch of many key product features. You will get an opportunity to work on a wide range of technologies (including Spring, AWS Elastic Search, Lambda, ECS, Redis, Spark, Kafka, etc.) and apply new technologies to solving problems. You will have an influence on defining product features, drive operational excellence, and spearhead the best practices that enable a quality product. You will get to work with skilled and motivated engineers who are already contributing to building high-scale, highly available systems.
If you are looking for an opportunity to work on leading technologies, want to build products that serve millions of customers with the best possible experience, and relish broad ownership and diverse technologies, join our team today!
What You'll Do:
- Create detailed designs, work on development, and perform code reviews.
- Implement validation and support activities in line with architecture requirements
- Help the team translate business requirements into R&D tasks and manage the roadmap of those tasks.
- Design, build, and implement the product; participate in requirements elicitation, validation of architecture, creation and review of high- and low-level designs, and assigning and reviewing tasks for product implementation.
- Work closely with product managers, UX designers, and end users, and integrate software components into a fully functional system
- Own products/features end-to-end through all phases from development to production.
- Ensure the developed features are scalable and highly available with no quality concerns.
- Work closely with senior engineers on refining the design and implementation.
- Manage and execute against project plans and delivery commitments.
- Assist directly and indirectly in the continual hiring and development of technical talent.
- Create and execute appropriate quality plans, project plans, test strategies, and processes for development activities in concert with business and project management efforts.
The ideal candidate is an engineer passionate about delivering experiences that delight customers and creating robust solutions. He/she should be able to commit to and own deliveries end-to-end.
What You'll Need:
- A Bachelor's degree in Computer Science or related technical discipline.
- 2-3+ years of software development experience with proficiency in Java or an equivalent object-oriented language, coupled with design and SOA experience
- Fluency with Java and Spring is desirable.
- Experience with JEE applications and frameworks such as Struts, Spring, MyBatis, Maven, and Gradle
- Strong knowledge of Data Structures, Algorithms and CS fundamentals.
- Experience with at least one shell scripting language and with SQL (SQL Server, PostgreSQL), plus data modeling skills
- Excellent analytical and reasoning skills
- Ability to learn new domains and deliver output
- Hands-on experience with core AWS services
- Experience working with CI/CD tools (Jenkins, Spinnaker, Nexus, GitLab, TeamCity, GoCD, etc.)
- Expertise in at least one of the following:
- Kafka, ZeroMQ, AWS SNS/SQS, or equivalent streaming technology
- Distributed cache/in memory data grids like Redis, Hazelcast, Ignite, or Memcached
- Distributed column store databases like Snowflake, Cassandra, or HBase
- Spark, Flink, Beam, or equivalent streaming data processing frameworks
- Proficiency in writing and reviewing Python and other object-oriented languages is a plus
- Experience building automations and CI/CD pipelines (integration, testing, deployment)
- Experience with Kubernetes would be a plus.
- Good understanding of working with distributed teams using Agile: Scrum, Kanban
- Strong interpersonal skills as well as excellent written and verbal communication skills
- Attention to detail and quality, and the ability to work well in and across teams
- Design software and make technology choices across the stack (from data storage to application to front-end)
- Understand a range of tier-1 systems/services that power our product to make scalable changes to critical-path code
- Own the design and delivery of an integral piece of a tier-1 system or application
- Work closely with product managers, UX designers, and end users, and integrate software components into a fully functional system
- Take ownership of products/features end-to-end for all phases from development to production
- Ensure the developed features are scalable and highly available with no quality concerns
- Work closely with senior engineers on refining designs and implementation
- Manage and execute project plans and delivery commitments
- Create and execute appropriate quality plans, project plans, test strategies, and processes for development activities in concert with business and project management efforts
THE IDEAL CANDIDATE WILL
- Engage with executive-level stakeholders from the client's team to translate business problems into a high-level solution approach
- Partner closely with practice and technical teams to craft well-structured, comprehensive proposals/RFP responses, clearly highlighting Tredence’s competitive strengths relevant to the client's selection criteria
- Actively explore the client’s business and formulate solution ideas that can improve process efficiency and cut costs, or achieve growth/revenue/profitability targets faster
- Work hands-on across various MLOps problems and provide thought leadership
- Grow and manage large teams with diverse skillsets
- Collaborate, coach, and learn with a growing team of experienced Machine Learning Engineers and Data Scientists
ELIGIBILITY CRITERIA
- BE/BTech/MTech (Specialization/courses in ML/DS)
- At least 7 years of consulting services delivery experience
- Very strong problem-solving skills & work ethic
- Strong analytical/logical thinking, storyboarding, and executive communication skills
- 5+ years of experience in Python/R, SQL
- 5+ years of experience in NLP algorithms, Regression & Classification Modelling, Time Series Forecasting
- Hands on work experience in DevOps
- Should have good knowledge of different deployment models such as PaaS, SaaS, and IaaS
- Exposure to cloud technologies like Azure, AWS, or GCP
- Knowledge of Python and packages for data analysis (scikit-learn, SciPy, NumPy, pandas, Matplotlib).
- Knowledge of deep learning frameworks: Keras, TensorFlow, PyTorch, etc.
- Experience with one or more Container-ecosystem (Docker, Kubernetes)
- Experience in building orchestration pipelines that convert plain Python models into deployable APIs/RESTful endpoints.
- Good understanding of OOP & Data Structures concepts
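To illustrate the "plain Python model to RESTful endpoint" skill above, here is a minimal hedged sketch using Flask. The `/predict` route, the request shape, and the stand-in `model_predict` function are all hypothetical; a production pipeline would wrap a trained model artifact and deploy behind a container/orchestrator rather than use the test client.

```python
from flask import Flask, jsonify, request

# Stand-in "model": averages the input features. Purely illustrative.
def model_predict(features):
    return sum(features) / len(features)

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # Expects a JSON body like {"features": [1.0, 2.0, 3.0]}.
    features = request.get_json()["features"]
    return jsonify({"prediction": model_predict(features)})

# Exercise the endpoint in-process with Flask's built-in test client.
client = app.test_client()
resp = client.post("/predict", json={"features": [1.0, 2.0, 3.0]})
print(resp.get_json())  # {'prediction': 2.0}
```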
Nice to Have:
- Exposure to deployment strategies such as Blue/Green, Canary, A/B Testing, and Multi-armed Bandit
- Experience in Helm is a plus
- Strong understanding of data infrastructure, data warehouse, or data engineering
You can expect to –
- Work with the world’s biggest retailers and help them solve some of their most critical problems. Tredence is a preferred analytics vendor for some of the largest retailers across the globe.
- Create multi-million-dollar business opportunities by leveraging an impact mindset, cutting-edge solutions, and industry best practices.
- Work in a diverse environment that keeps evolving
- Hone your entrepreneurial skills as you contribute to the growth of the organization
- Hands-on experience with Spark and SQL
- Good to have: Java knowledge