


Strong knowledge of statistical and data mining techniques: GLM/regression, random forest, boosting, trees, text mining, etc. (see the sketch below)
Sound knowledge of querying databases and using statistical computing languages: R, Python, SQL, etc.
Strong understanding of creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
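A small scikit-learn sketch of the modeling families named above, fit on synthetic data (the model choices and parameters are illustrative only, not a prescribed workflow):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a real modeling dataset.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit the three families named above: GLM, random forest, boosting.
for model in (LogisticRegression(max_iter=1_000),
              RandomForestClassifier(random_state=0),
              GradientBoostingClassifier(random_state=0)):
    model.fit(X_train, y_train)
    print(type(model).__name__, accuracy_score(y_test, model.predict(X_test)))
```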


The Sr AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You will collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and communicate the resulting patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.

Requirements:
- Strong experience as an AWS/Azure/GCP Data Engineer; AWS/Azure/GCP Databricks experience is a must
- Expert proficiency in Spark, Scala, and Python
- Must have data migration experience from on-prem to cloud
- Hands-on experience with Kinesis to process and analyze stream data, Event/IoT Hubs, and Cosmos DB
- In-depth understanding of Azure/AWS/GCP cloud and of data lake and analytics solutions on Azure
- Expert-level hands-on experience designing and developing applications on Databricks
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
- In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
- Hands-on experience with the industry technology stack for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
- Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Good working knowledge of code versioning tools such as Git, Bitbucket, or SVN
- Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs (see the sketch after this list)
- Experience preparing data for data science and machine learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
- Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.)
- Good to have: programming experience with .NET or Spark/Scala
- Experience creating tables, partitioning, bucketing, loading, and aggregating data using Spark Scala, Spark SQL, and PySpark
- Knowledge of AWS/Azure/GCP DevOps processes such as CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
- Working experience with Visual Studio, PowerShell scripting, and ARM templates; able to build ingestion into ADLS and enable a BI layer for analytics
- Strong understanding of data modeling and of defining conceptual, logical, and physical data models
- Big data/analytics/information analysis/database management in the cloud
- IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros and cons, and migration considerations
- Ability to stay up to date with industry standards and technological advancements that enhance data quality and reliability to advance strategic initiatives
- Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
- Guide customers in transforming big data projects, including development and deployment of big data and AI applications
- Guide customers on data engineering best practices; provide proofs of concept, architect solutions, and collaborate as needed
- 2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near-real-time data warehousing, and machine learning solutions
- 5+ years of overall experience in software development, data engineering, or data analytics using Python, PySpark, Scala, Spark, Java, or equivalent technologies, with hands-on expertise in Apache Spark (Scala or Python)
- 3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
- Ability to manage competing priorities in a fast-paced environment
- Ability to resolve issues
- Basic experience with or knowledge of agile methodologies
Certifications:
- AWS Certified: Solutions Architect Professional
- Databricks Certified Associate Developer for Apache Spark
- Microsoft Certified: Azure Data Engineer Associate
- Google Cloud Certified: Professional
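A minimal PySpark sketch of the Spark SQL and table-management skills listed above: reading JSON and Parquet sources, aggregating with Spark SQL, and writing a partitioned, bucketed table. All paths, table names, and columns are hypothetical placeholders:

```python
from pyspark.sql import SparkSession

# Hypothetical standalone session; on Databricks a `spark` session is provided.
spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

# Read a semi-structured and a columnar source (paths are placeholders).
events = spark.read.json("/mnt/raw/events/")
orders = spark.read.parquet("/mnt/raw/orders/")

# Register temp views and aggregate with Spark SQL.
events.createOrReplaceTempView("events")
orders.createOrReplaceTempView("orders")
daily = spark.sql("""
    SELECT o.order_date, o.customer_id, SUM(o.amount) AS revenue
    FROM orders o
    JOIN events e ON e.order_id = o.order_id
    GROUP BY o.order_date, o.customer_id
""")

# Persist as a partitioned, bucketed table. Parquet is used here because
# Delta tables do not support bucketBy; a Delta table would typically rely
# on partitioning plus OPTIMIZE/ZORDER instead.
(daily.write
      .format("parquet")
      .mode("overwrite")
      .partitionBy("order_date")
      .bucketBy(8, "customer_id")
      .sortBy("customer_id")
      .saveAsTable("analytics.daily_revenue"))
```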
- Bachelor's degree in Computer Science or equivalent education
- At least 5 years of experience in a relevant technical position.
- Azure and/or AWS experience
- Strong in CI/CD concepts and technologies like GitOps (Argo CD)
- Hands-on experience with DevOps Tools (Jenkins, GitHub, SonarQube, Checkmarx)
- Experience with Helm Charts for package management
- Strong in Kubernetes, OpenShift, and Container Network Interface (CNI)
- Experience with programming and scripting languages (Spring Boot, NodeJS, Python)
- Strong container image management experience using Docker and distroless concepts
- Familiarity with Shared Libraries for code reuse and modularity
- Excellent communication skills (verbal, written, and presentation)
Note: Looking for immediate joiners only.
Responsibilities:
- Collaborate with the development team to understand project requirements and provide input on testability.
- Design and develop comprehensive test plans and test cases for UI/API, based on project requirements.
- Write and execute automated tests using JavaScript with Cypress or Playwright (see the sketch after this list).
- Perform manual testing when necessary to ensure complete test coverage.
- Identify and document defects, and work closely with developers to resolve them.
- Conduct regression testing to ensure software quality and reliability.
- Collaborate with cross-functional teams to ensure that software meets quality standards.
- Participate in code reviews to ensure testability and maintainability of the codebase.
- Maintain and update test documentation as necessary.
- Stay up-to-date with industry trends and best practices in software testing.
- Continuously improve the testing process and suggest process enhancements.
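A minimal sketch of the kind of automated UI test described above, shown here with Playwright's Python binding (the role itself calls for the JavaScript API; the URL, selectors, and credentials are hypothetical):

```python
from playwright.sync_api import sync_playwright, expect

def test_login_flow():
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://example.com/login")  # hypothetical app URL

        # Fill the form and submit (field labels are placeholders).
        page.get_by_label("Email").fill("qa@example.com")
        page.get_by_label("Password").fill("not-a-real-password")
        page.get_by_role("button", name="Log in").click()

        # Assert the post-login state.
        expect(page).to_have_url("https://example.com/dashboard")
        browser.close()

if __name__ == "__main__":
    test_login_flow()
```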
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3-4 years of experience in software testing.
- Proficiency in writing test scripts using JavaScript.
- Experience with test automation tools such as Cypress or Playwright.
- Strong understanding of software testing methodologies and best practices.
- Excellent problem-solving and debugging skills.
- Strong attention to detail and commitment to quality.
- Good communication and collaboration skills.
- Willingness to work from the office in Hyderabad.
Benefits:
- Office Gear allocation
- Mediclaim policy for self, spouse, kids, and parents
- Wellness policy for self
- PTO
- 5-day work week
- Subscription reimbursements
- Reimbursements for self-learning and professional development

Job Description:
7+ years of experience as a Techno-Functional specialist supporting a finance organization with an emphasis on project management.
Must have 2-3 project implementations and more than one support engagement in Oracle EBS R12.
Expert-level knowledge of Oracle Applications Framework, Oracle Workflow, Oracle Forms, Oracle Reports, XML reports, and PL/SQL.
Functional knowledge of financial modules such as GL, AP, AR, PO, and Projects is a must. Extensive use of Oracle APIs and interfaces. Experience in performance tuning of code and in shell scripting.
Worked on Oracle Alerts, AME, OAM, DFFs, etc. Good to have: ADF experience, ORDS experience, and exposure to cloud technology.
Responsibilities:
Act as a senior liaison between business partners and IT communities. Facilitate cross-functional teams through business process and application initiatives, from requirements definition to prototyping, design, development, testing, training, and deployment.
Create technical designs to support business requirements.
Lead the development of technical solutions.
Communicate project status and escalate issues to management.
Develop and execute system test plans and scenarios.
Assist in preparing training materials for deployment of applications, modules, enhancements and bug fixes.
Assist in developing baseline datasets and reports to support testing of technical solutions.
Come Dive In
The DevOps Engineer will implement the tools and processes that enable DevOps, engaging in and improving the whole lifecycle of services, from inception and design through deployment, operation, and refinement, to efficiently deliver high-quality solutions. The candidate should bridge the gap between development and operations teams, working with development teams to meet acceptance criteria and to gather and document requirements. Candidates should be able to work in fast-paced, multi-disciplinary environments.
As a DevOps Engineer, You Will
● Work in a dynamic, agile team environment developing excellent new applications.
● Participate in design decisions, including new technology research and prototyping
● Collaborate closely with other AWS engineers and architects, cloud engineers, support teams, and other stakeholders
● Promote great Kubernetes and AWS platform design and quality
● Innovate new ideas to evolve our applications and processes
● Continuously analyze and evaluate our systems, products, and methods for potential improvements
Mandatory Skills:
● Experience with Linux-based infrastructure
● Experience with Amazon ECS
● Hands-on experience with containerized services
● Must know AWS CI/CD pipelines
● Must know DevOps concepts and Agile principles
● Knowledge of Git, Docker, and Jenkins
● Knowledge of Infrastructure as Code (see the sketch after this list)
● Experience using automation tools
● Must have experience setting up Test-Driven Development environments
● Working knowledge of Docker and Kubernetes
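One way to exercise the Infrastructure-as-Code and ECS points above is the AWS CDK; a minimal Python sketch follows (the stack and resource names are hypothetical, and CDK is only one IaC option alongside Terraform or raw CloudFormation):

```python
from aws_cdk import App, Stack
from aws_cdk import aws_ec2 as ec2, aws_ecs as ecs
from constructs import Construct

class DemoClusterStack(Stack):
    """Hypothetical stack: a VPC plus an empty ECS cluster."""
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        vpc = ec2.Vpc(self, "DemoVpc", max_azs=2)   # two-AZ network
        ecs.Cluster(self, "DemoCluster", vpc=vpc)   # cluster for containerized services

app = App()
DemoClusterStack(app, "DemoClusterStack")
app.synth()  # emits CloudFormation; deploy with `cdk deploy`
```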
We recognize that asking you to give 100% of yourself daily requires us to show you the love.
PERKS: what can we offer you?
● Bi-yearly performance audits and appraisals
● Flexibility in working days/hours
● 5 working days/week (Mon to Fri), with additional payout for working Saturdays
● Recognition and Appreciation
● A plethora of industry exposure and self-growth opportunities
Visit our site: www.cedcoss.com
Started in 2015, this lifestyle and accessories startup has taken over the consumer electronics sector in India. Our client has a product range that includes an extensive catalog of headphones, speakers, travel accessories, and modern earphones. It believes in providing cutting-edge electronic products stamped with durability and affordability.
The brand is associated with major icons across categories and has tie-ups with industries covering fashion, sports, and, of course, music. The founders are marketing graduates with vast experience in consumer lifestyle products and other major brands. With their vigorous efforts toward quality and marketing, they have been able to strike a chord with major e-commerce brands and with consumers.
- Designing of embedded systems for instrumentation
- Software: Participating in architecture definition, coding, testing, debugging and documentation
- Hardware: Executing PCB design, schematics, component selection, simulation, layout, and troubleshooting
- Diagnosing and resolving hardware and software issues
- Ensuring the creation, specification, and implementation of test equipment and code
- Enhancing existing and/or developing new hardware designs/technology that enables the launch of new products
- Providing technical solutions, following defined processes, and leading in hardware development and technology innovations
- Reviewing project deliverables and product documentation to ensure quality levels are met
- Reviewing technical concepts of all electronics projects including equipment specification and design
What you need to have:
- Willing to work in a consultant role.
- Exceptionally good knowledge of component and sensor selection
- Experience in schematic design and layout; experience in design and validation testing
- Excellent communication skills; experience coordinating with distributors
- Experience across all stages of the hardware product development lifecycle
- Proven experience in embedded systems design
- Must have good knowledge of Bluetooth, BLE, Wi-Fi standards
- Experience in developing wearables and IoT systems
LemmaTree aims to empower individuals and organizations with control of their verifiable data. In doing so, it seeks to inspire the building of disruptive businesses striving for transparency and portability within the digital credentials and identities space. LemmaTree comprises three businesses: Affinidi (https://www.affinidi.com/about-us), Trustana (https://www.trustana.com/), and GoodWorker (https://www.goodworker.in/), all of which are seeded and funded by Temasek, a global investment company headquartered in Singapore.
Affinidi is a core technology company focused on digital identity. We empower institutions to issue verifiable credentials to individuals and businesses, who gain ownership and control of their data. We partner with governments and stakeholders within the travel and aviation, healthcare, and technology space to build a trusted ecosystem that enables safe travels across the globe through Affinidi Travel (https://www.travel.affinidi.com/). We also deploy decentralized technology and verifiable credentials to global businesses within financial services, including financial institutions, regulators, fintechs, and more, to redefine the way they serve organizations and individuals, through Affinidi Finnovate (https://sg.linkedin.com/showcase/affinidi-finnovate). Affinidi Finnovate is powering chekFIN, a decentralized credentials platform that allows financial institutions to obtain verified credentials about fintechs. Affinidi Finnovate also enables organizations to issue digital certificates which can be verified and shared online.
What you’ll do:
- Define and execute the recruitment strategy for our India hub
- Directly source high caliber technical talent using tools like LinkedIn, GitHub, StackOverflow to name a few
- Manage the end-to-end recruitment process, from direct sourcing to interviewing and offer negotiation
- Partner with a variety of hiring managers to create robust interview processes
- Articulate company culture and values throughout the recruiting process
- Use data and metrics to measure our performance and provide insights on how we can improve our recruiting efforts
- Build talent pools of candidates for key disciplines like Engineering and Product
- Develop and implement a delightful candidate experience
- Partner with our global Talent Team and work on key strategic projects
You should apply if:
- You’ve got extensive experience as an internal recruiter in a high growth tech company/startup
- You have strong knowledge of the talent market in India and know where to source high caliber candidates from
- Proven track record in direct sourcing for candidates using a variety of tools
- You are confident managing and advising stakeholders on how to approach building their teams
- Experienced in building and utilizing a variety of recruiting tools and channels
- Strong ability to work in a fast-paced, dynamic work environment
- Strong ability to work in a team environment
- Excellent written and verbal communication
Logistics:
The interview process for this role is:
- 30m interview with one of our Talent Team or Recruitment Partners
- 1h interview with Operations Lead
- Final interview with our Leadership Team
We can be flexible with the structure if someone's circumstances or timescales require it for good reason; just let us know!
Please reach out if you have any specific requirements so we can be as accommodating as possible for you.
Responsibilities / Requirements:
● Be hands-on in the design and implementation of NestJS REST APIs.
● Work with DevOps engineers to scale and optimise NestJS microservices.
● Must have an excellent understanding of how the web works.
● Extensive knowledge of OOP, Design Patterns, and SOLID Principles.
● Familiar with modern engineering practices: coding standards, code reviews, continuous deployment, automated testing.
● Must be willing to constantly learn new things.
● Knowledge of algorithms and data structures.
Technologies: NestJS, TypeScript, MySQL, Docker, Kafka (see the Kafka consumer sketch below).
Knowledge of NestJS would be preferable.
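The stack above centers on NestJS and TypeScript, but the Kafka piece is language-agnostic. A minimal consumer sketch using the kafka-python client, one of several Python Kafka clients (the broker address, topic, and group ID are hypothetical placeholders):

```python
import json
from kafka import KafkaConsumer

# Hypothetical broker/topic; in the NestJS service itself this role would
# typically be filled by a microservice transport rather than this script.
consumer = KafkaConsumer(
    "orders",                                # topic name (placeholder)
    bootstrap_servers="localhost:9092",      # broker address (placeholder)
    group_id="analytics-workers",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    # Each record carries its topic, partition, offset, and decoded payload.
    print(message.topic, message.partition, message.offset, message.value)
```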



