Required Qualifications:
∙Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent experience.
∙5+ years of experience in a DevOps role, preferably for a SaaS or software company.
∙Expertise in cloud computing platforms (e.g., AWS, Azure, GCP).
∙Proficiency in scripting languages (e.g., Python, Bash, Ruby).
∙Extensive experience with CI/CD tools (e.g., Jenkins, GitLab CI, Travis CI).
∙Extensive experience with NGINX and similar web servers.
∙Strong knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
∙Familiarity with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
∙Ability to work on-call as needed and respond to emergencies in a timely manner.
∙Experience with highly transactional e-commerce platforms.
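The qualifications above pair scripting proficiency with NGINX experience; a typical everyday task combining the two is summarizing an access log. A minimal sketch (the log lines, regex, and function name are illustrative, not from any posting):

```python
import re
from collections import Counter

# Matches the HTTP status code in NGINX "combined" log format lines.
# Sample data is hypothetical; a real script would read the access log file.
STATUS_RE = re.compile(r'" (\d{3}) ')

SAMPLE_LINES = [
    '10.0.0.1 - - [01/Jan/2024:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "curl/8.0"',
    '10.0.0.2 - - [01/Jan/2024:00:00:02 +0000] "GET /api HTTP/1.1" 502 97 "-" "curl/8.0"',
    '10.0.0.3 - - [01/Jan/2024:00:00:03 +0000] "POST /login HTTP/1.1" 200 88 "-" "curl/8.0"',
]

def status_class_counts(lines):
    """Count responses per status class (2xx, 5xx, ...)."""
    counts = Counter()
    for line in lines:
        m = STATUS_RE.search(line)
        if m:
            counts[m.group(1)[0] + "xx"] += 1
    return counts

print(dict(status_class_counts(SAMPLE_LINES)))
```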
Preferred Qualifications:
∙Certifications in cloud computing or DevOps are a plus (e.g., AWS Certified DevOps Engineer, Azure DevOps Engineer Expert).
∙Experience in a high availability, 24x7x365 environment.
∙Strong collaboration, communication, and interpersonal skills.
∙Ability to work independently and as part of a team.
About Fintrac Global Services
Desired Candidate Profile:
- 6+ years of experience delivering enterprise-wide technical projects, of which 2-3 years should have been spent managing, growing, and retaining talent within teams.
- Hands-on experience programming with RESTful APIs and Python, and with deployment and runtime environments like Docker and Kubernetes.
- Hands-on front-end experience with HTML, JavaScript, React, and CSS.
- Deep understanding of the network layer: HTTP, HTTPS, cookies, local storage.
- Ability to solve complex integration and product questions, looking at issues analytically.
- Organize the team’s scrums and prepare the roadmap for dev tools.
- Track integration success and ensure deadlines are met. Provide input to a scalable technical integration and account management process.
Desired Skills:
- Bachelor’s degree in Computer Science, or another degree with relevant experience.
- Excellent communication skills, good visual design sense, in-depth experience in developing web-based applications.
- Experience leading multi engineer projects and mentoring junior engineers.
- Ability to work in a fast-paced environment, multi-task, and perform under pressure, possessing a high level of energy, efficiency, and flexibility.
- Excellent business writing, presentation and communication skills.
Responsibilities:
- Collaborating with data scientists and machine learning engineers to understand their requirements and design scalable, reliable, and efficient machine learning platform solutions.
- Building and maintaining the applications and infrastructure to support end-to-end machine learning workflows, including inference and continual training.
- Developing systems for the definition, deployment, and operation of the different phases of the machine learning and data life cycles.
- Working within Kubernetes to orchestrate and manage containers, ensuring high availability and fault tolerance of applications.
- Documenting the platform's best practices, guidelines, and standard operating procedures and contributing to knowledge sharing within the team.
Requirements:
- 3+ years of hands-on experience in developing and managing machine learning or data platforms
- Proficiency in programming languages commonly used in machine learning and data applications, such as Python, Rust, Bash, and Go.
- Experience with containerization technologies, such as Docker, and container orchestration platforms like Kubernetes.
- Familiarity with CI/CD pipelines for automated model training and deployment. Basic understanding of DevOps principles and practices.
- Knowledge of data storage solutions and database technologies commonly used in machine learning and data workflows.
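The requirements above mention CI/CD pipelines for automated model training and deployment; in practice the core of many such pipelines is a "promote only if better" gate. A minimal sketch, with toy models and a hypothetical metric (none of these names come from the posting):

```python
# Minimal sketch of a "promote if better" deployment gate, the core of many
# automated retraining pipelines. Models, data, and metric are toy examples.

def evaluate(model, data):
    """Toy accuracy: fraction of (x, label) pairs the model gets right."""
    return sum(1 for x, y in data if model(x) == y) / len(data)

def promote_if_better(candidate, production, holdout, min_gain=0.01):
    """Deploy the candidate only if it beats production by min_gain on holdout."""
    cand_score = evaluate(candidate, holdout)
    prod_score = evaluate(production, holdout)
    return candidate if cand_score >= prod_score + min_gain else production

# Toy models: classify a number as positive (1) or not (0).
production = lambda x: 1 if x > 10 else 0   # stale threshold
candidate = lambda x: 1 if x > 0 else 0     # retrained threshold
holdout = [(5, 1), (-3, 0), (12, 1), (0.5, 1)]

winner = promote_if_better(candidate, production, holdout)
print("candidate" if winner is candidate else "production")
```

Real pipelines would wire this gate into a CI job that trains, evaluates on a held-out set, and only then pushes the model artifact to serving.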
Job Title:- AWS Solution Architect
CTC:- 22 LPA
Experience:- 7+ Years
Location:- Goregaon, Thane
Working Mode:- WFO
Primary Skills
- Proven experience as an AWS Solution Architect or a similar role, with a strong understanding of AWS services and architecture.
- AWS certifications such as AWS Certified Solutions Architect - Associate or AWS Certified Solutions Architect - Professional.
- Strong knowledge of cloud computing concepts, distributed systems, and architectural best practices.
- Proficiency in designing and implementing AWS solutions for computing, storage, networking, database, and security.
- Familiarity with automation and orchestration tools such as AWS CloudFormation or Terraform.
Qualification:- BCA, MCA, B.Tech, M.Tech
Responsibilities:-
- Collaborate with clients and internal teams to understand business requirements and design effective solutions using AWS services.
- Assess infrastructure needs and design scalable and resilient architectures that leverage AWS best practices.
- Assist with the migration of on-premises applications and infrastructure to AWS, ensuring minimal downtime and maximum efficiency.
- Optimize application performance and cost by analyzing and fine-tuning AWS resources.
- Ensure security and compliance of AWS solutions by implementing appropriate security controls, encryption, and access management.
- Create and maintain technical documentation, including architecture diagrams, deployment guides, and best practices.
- Provide guidance and support to development teams in implementing AWS services and integrating them into applications.
- Stay updated with the latest AWS services, features, and industry trends, and evaluate their applicability to the organization's needs.
- Collaborate with cross-functional teams, including developers, operations, and project managers, to deliver successful AWS solutions.
Location - Mohali
5 days working
Job Description
- Active Directory Domain, Group Policies, Domain Controller migration and upgrades.
- File and print sharing, NTFS permissions, file server migrations.
- Microsoft Exchange or Office 365 messaging, Outlook configurations.
- Knowledge of data backups and backup strategies; experience with backup tools will be an additional advantage.
- Basic knowledge of routers, firewalls, NAT, and VPN configuration; SonicWall preferable.
- Knowledge and working experience with ticketing systems and remote administration tools.
- Good desktop troubleshooting experience.
- Antivirus installation and troubleshooting.
- Knowledge of DHCP and DNS management.
- Ticketing and RMM tools: LabTech, Kaseya, Autotask (experience preferred).
Should have 7-12 years of overall experience and team management ability.
Minimum 5 years of hands-on experience with Golang.
Minimum 2-3 years of proficient DevOps experience, including end-to-end project implementation.
Strong expertise in DevOps concepts such as Continuous Integration (CI), Continuous Delivery (CD), Infrastructure as Code, and cloud deployments.
Minimum 1-3 years of good experience with Terraform.
Minimum 6 months to 1 year of experience with a front-end technology such as Vue.js.
Good experience or working knowledge of AWS, Docker, Kubernetes, Helm, etc.
Job Description
This requirement is to service our client, a leading big data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Your work will have a direct impact on driving business strategies for prominent industry leaders. Self-motivation and strong communication skills are both must-haves. The ability to work in a fast-paced environment is desired.
Problems being solved by our client:
Measure consumer usage of devices linked to the internet and home networks, including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats, and other appliances. There are more screens and other connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses a measurement technology to unravel the dynamics of consumers’ interactions with multiple devices.
Duties and responsibilities:
- The successful candidate will contribute to the development of novel audience measurement and demographic inference solutions.
- Develop, implement, and support statistical or machine learning methodologies and processes.
- Build and test new features and concepts, and integrate them into the production process
- Participate in ongoing research and evaluation of new technologies
- Apply your experience across the development lifecycle: analysis, design, development, testing, and deployment of this system
- Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely and quality data. You will be the knowledge expert, delivering quality data to our clients
Qualifications:
- 3-5 years of relevant work experience in the areas outlined below
- Experience in extracting data using SQL from large databases
- Experience writing complex ETL processes and frameworks for analytics and data management; hands-on experience with ETL tools is a must.
- Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar degree with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered.
- Programming experience in a scientific computing language (R, Python, Julia) and the ability to interact with relational data (SQL, Apache Pig, SparkSQL). General-purpose programming (Python, Scala, Java) and familiarity with Hadoop are a plus.
- Excellent verbal and written communication skills.
- Experience with TV or digital audience measurement or market research data is a plus.
- Familiarity with systems analysis or systems thinking is a plus.
- Must be comfortable analyzing complex, high-volume, and high-dimensional data from varying sources
- Ability to engage with Senior Leaders across all functional departments
- Ability to take on new responsibilities and adapt to changes
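The qualifications above center on extracting and aggregating data with SQL from large databases. A minimal sketch of that workflow using an in-memory SQLite database (the schema and sample data are hypothetical, loosely echoing the device-usage measurement described earlier; production work would target a large warehouse):

```python
import sqlite3

# Hypothetical miniature of the "extract with SQL" task: aggregate device
# usage events per device type. Schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (device TEXT, minutes REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("smart_tv", 42.0), ("mobile", 15.5), ("smart_tv", 30.0), ("tablet", 7.0)],
)

rows = conn.execute(
    """SELECT device, COUNT(*) AS sessions, SUM(minutes) AS total_minutes
       FROM events
       GROUP BY device
       ORDER BY total_minutes DESC"""
).fetchall()

for device, sessions, total in rows:
    print(device, sessions, total)
```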
Introduction
Synapsica (http://www.synapsica.com/) is a series-A funded HealthTech startup (funding coverage: https://yourstory.com/2021/06/funding-alert-synapsica-healthcare-ivycap-ventures-endiya-partners/) founded by alumni from IIT Kharagpur, AIIMS New Delhi, and IIM Ahmedabad. We believe healthcare needs to be transparent and objective while being affordable. Every patient has the right to know exactly what is happening in their body, without having to rely on a cryptic two-liner given to them as a diagnosis.
Towards this aim, we are building an artificial intelligence enabled, cloud-based platform to analyse medical images and create v2.0 of advanced radiology reporting. We are backed by IvyCap, Endiya Partners, YCombinator, and other investors from India, the US, and Japan. We are proud to have GE and The Spinal Kinetics as our partners. Here’s a small sample of what we’re building: https://www.youtube.com/watch?v=FR6a94Tqqls
Your Roles and Responsibilities
Synapsica is looking for a Principal AI Researcher to lead and drive AI-based research and development efforts. The ideal candidate will have extensive experience in computer vision and AI research, through academic work or industrial R&D projects, and will be excited to work on advanced exploratory research and development in computer vision and machine learning to create the next generation of advanced radiology solutions.
The role involves computer vision tasks including the development, customization, and training of Convolutional Neural Networks (CNNs); the application of ML techniques (SVM, regression, clustering, etc.); and traditional image processing (OpenCV, etc.). The role is research-focused and involves reading and implementing existing research papers, deep problem analysis, frequent review of results, generating new ideas, building new models from scratch, publishing papers, and automating and optimizing key processes. The work spans real-world data handling through to advanced methods such as transfer learning, generative models, and reinforcement learning, with a focus on understanding quickly and experimenting even faster. The successful candidate will collaborate closely with the medical research team, software developers, and AI research scientists, and must be creative, ask questions, and be comfortable challenging the status quo. The position is based in our Bangalore office.
Primary Responsibilities
- Interface between product managers and engineers to design, build, and deliver AI models and capabilities for our spine products.
- Formulate and design AI capabilities of our stack with special focus on computer vision.
- Strategize end-to-end model training flow including data annotation, model experiments, model optimizations, model deployment and relevant automations
- Lead teams, engineers, and scientists to envision and build new research capabilities and ensure delivery of our product roadmap.
- Organize regular reviews and discussions.
- Keep the team up-to-date with latest industrial and research updates.
- Publish research and clinical validation papers
Requirements
- 6+ years of relevant experience in solving complex real-world problems at scale using computer vision-based deep learning.
- Prior experience in leading and managing a team.
- Strong problem-solving ability
- Prior experience with Python, cuDNN, TensorFlow, PyTorch, Keras, Caffe (or similar deep learning frameworks).
- Extensive understanding of computer vision/image processing applications such as object classification, segmentation, and object detection.
- Ability to write custom Convolutional Neural Network architectures in PyTorch (or similar).
- Background in publishing research papers and/or patents
- Computer Vision and AI Research background in medical domain will be a plus
- Experience of GPU/DSP/other Multi-core architecture programming
- Effective communication with other project members and project stakeholders
- Detail-oriented, eager to learn, acquire new skills
- Prior Project Management and Team Leadership experience
- Ability to plan work and meet deadlines
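The requirements above ask for the ability to write custom CNN architectures; the primitive underlying every CNN layer is a 2D "valid" cross-correlation (what deep learning frameworks call convolution). A plain-Python sketch of just that primitive, for illustration only; real models would of course use PyTorch or TensorFlow:

```python
# Plain-Python sketch of the 2D "valid" cross-correlation a CNN layer computes.
# Illustrative only; real architectures would be built in PyTorch/TensorFlow.

def conv2d_valid(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw)
            )
    return out

# A vertical-edge detector (Sobel x) on a tiny image: left half 0s, right half 1s.
image = [[0, 0, 1, 1]] * 4
sobel_x = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
print(conv2d_valid(image, sobel_x))  # strong response at the 0->1 edge
```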
Responsibilities:
* 3+ years of data engineering experience - design, develop, deliver, and maintain data infrastructures.
* SQL specialist – strong knowledge of and seasoned experience with SQL queries.
* Languages: Python.
* Good communicator, shows initiative, works well with stakeholders.
* Experience working closely with data analysts, providing the data they need and guiding them through issues.
* Solid ETL experience with Hadoop, Hive, PySpark, Presto, and SparkSQL.
* Solid communication and articulation skills.
* Able to handle stakeholders independently with minimal intervention from the reporting manager.
* Develop strategies to solve problems in logical yet creative ways.
* Create custom reports and presentations accompanied by strong data visualization and storytelling.
We would be excited if you have:
* Excellent communication and interpersonal skills
* Ability to meet deadlines and manage project delivery
* Excellent report-writing and presentation skills
* Critical thinking and problem-solving capabilities
• Strong knowledge of SQL and ETL testing.
• Extensive experience in ETL/data warehouse back-end testing and Business Intelligence reports testing.
• Hands-on back-end testing skills and strong RDBMS and testing methodologies.
• Expertise in test management and defect tracking tools, e.g., HP Quality Center, Jira.
• Proficient experience working with SDLC and Agile methodology.
• Excellent knowledge of database systems: Vertica, Oracle, Teradata.
• Knowledge of security testing will be an added advantage.
• Experience in Business Intelligence testing of various reports using Tableau.
• Strong comprehension, analytical, and problem-solving skills.
• Good interpersonal and communication skills, a quick learner, and good troubleshooting capabilities.
• Good knowledge of the Python programming language.
• Working knowledge of AWS.
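The back-end ETL testing called for above often reduces to reconciliation checks between source and target after a load. A hedged sketch using an in-memory SQLite database; the schema, the simulated ETL step, and the checksum column are all hypothetical:

```python
import sqlite3

# Sketch of a basic ETL back-end test: after a (simulated) load, reconcile
# row counts and a column checksum between source and target tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.5), (3, 30.0);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO tgt SELECT id, amount FROM src;  -- the ETL step under test
""")

def reconcile(conn, source, target, column):
    """Compare (row count, rounded sum of column) between two tables."""
    q = "SELECT COUNT(*), ROUND(SUM({c}), 2) FROM {t}"  # toy example; not injection-safe
    src = conn.execute(q.format(c=column, t=source)).fetchone()
    tgt = conn.execute(q.format(c=column, t=target)).fetchone()
    return src == tgt

print(reconcile(conn, "src", "tgt", "amount"))
```

In a real test suite each reconciliation would be an assertion in a test case, run against the actual warehouse (Vertica, Oracle, Teradata) rather than SQLite.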
About this role
We are seeking a seasoned DBA to join our team. You will have extensive experience as a database administrator in mission-critical environments over a successful career. You will primarily work on supporting our databases: AWS RDS (MySQL), MongoDB, and Cassandra.
What You’ll Do
- Provide leadership in database scaling & monitoring. Maintain runbooks for our 24x7 SRE team.
- Assist in troubleshooting and resolution of database issues. Perform database upgrades and patches.
- Define and develop data pipelines between platforms in a mission-critical environment using DMS, Fivetran, Kafka, etc.
- Work with our developers to optimize and tune their SQL queries.
- Assist in application development, debugging, and optimization with respect to data concerns.
- Provide input in the design and implementation of backup, recovery, and DR strategy
- Review application and database design for compliance.
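One responsibility above is working with developers to optimize and tune their SQL queries; the usual first step is comparing the query plan before and after adding an index. A hedged illustration using SQLite's EXPLAIN QUERY PLAN (on MySQL/RDS you would use EXPLAIN instead); the table and query are toy examples:

```python
import sqlite3

# Illustration of index-vs-scan query tuning via SQLite's EXPLAIN QUERY PLAN.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, float(i)) for i in range(1000)])

def plan(conn, sql):
    """Return the plan's detail text (last column of each plan step)."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(conn, query)               # full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(conn, query)                # now a search using the index

print("SCAN" in before, "USING INDEX" in after)
```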
What You’ll Need
- Ability to script in a high-level language such as Python or Ruby for automation.
- 8+ years of experience with database services such as AWS RDS (MySQL), MongoDB, and Cassandra.
- 4+ years of experience in database design and DevOps on AWS.
- Must demonstrate a clear understanding of logical and physical database design and standards.
- Experience in columnar data warehouse solutions like Redshift or Snowflake.
- Must have extensive hands-on experience with MySQL in a large scale 24x7 production environment with millions of records.
- Bachelor of Science in Computer Science, Mathematics, or Engineering; or equivalent work experience
Bonus Points If You Have
- Experience with NoSQL solutions such as Cassandra/MongoDB etc.
- Experience with software development life cycle (SDLC) in an agile environment
- Hands-on production experience with Big Data applications such as Spark.
- Knowledge of development best practices (source control with Git, continuous integration, automated testing).