11+ Good Clinical Practice Jobs in Pune | Good Clinical Practice Job openings in Pune
- Sr. Data Engineer:
Core Skills – Data Engineering, Big Data, PySpark, Spark SQL, and Python
Candidates with a prior Palantir Foundry or Clinical Trial Data Model background are preferred
Major accountabilities:
- Responsible for data engineering, Foundry data pipeline creation, Foundry analysis and reporting, Slate application development, reusable code development and management, and integrating internal or external systems with Foundry for high-quality data ingestion
- Have a good understanding of the Foundry platform landscape and its capabilities
- Perform the data analysis required to troubleshoot data-related issues and assist in their resolution
- Define company data assets (data models) and the PySpark/Spark SQL jobs that populate them
- Design data integrations and the data quality framework
- Design and implement integrations with internal and external systems and the F1 AWS platform using Foundry Data Connector or the Magritte agent
- Collaborate with data scientists, data analysts, and technology teams to document and leverage their understanding of Foundry's integration with different data sources
- Actively participate in agile work practices
- Coordinate with quality engineers to ensure that all quality controls, naming conventions, and best practices have been followed
Desired Candidate Profile:
- Strong data engineering background
- Experience with Clinical Data Model is preferred
- Experience in:
- SQL Server, PostgreSQL, Cassandra, Hadoop, and Spark for distributed data storage and parallel computing
- Java and Groovy for our back-end applications and data integration tools
- Python for data processing and analysis
- Cloud infrastructure based on AWS EC2 and S3
- 7+ years of IT experience, including 2+ years with the Palantir Foundry platform and 4+ years with big data platforms
- 5+ years of Python and PySpark development experience
- Strong troubleshooting and problem-solving skills
- BTech or master's degree in computer science or a related technical field
- Experience designing, building, and maintaining big data pipeline systems
- Hands-on experience with the Palantir Foundry platform and custom Foundry app development
- Able to design and implement data integration between Palantir Foundry and external apps using the Foundry data connector framework
- Hands-on with programming languages, primarily Python, R, Java, and Unix shell scripts
- Hands-on experience with the AWS/Azure cloud platforms and stacks
- Strong grasp of API-based architecture and concepts; able to build quick PoCs using API integration and development
- Knowledge of machine learning and AI
- Skill and comfort working in a rapidly changing environment with dynamic objectives and iteration with users
- Demonstrated ability to continuously learn, work independently, and make decisions with minimal supervision
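The data-quality responsibility above can be sketched in plain Python (a minimal, hypothetical illustration — in a real Foundry pipeline these rules would run as PySpark jobs, and the field names below are assumptions, not part of the role's actual framework):

```python
# Minimal sketch of a row-level data-quality check in plain Python.
# The rule set (required fields) is a hypothetical example.

def check_row(row, required_fields=("subject_id", "visit_date")):
    """Return a list of data-quality violations for one record."""
    violations = []
    for field in required_fields:
        if not row.get(field):  # missing key or empty value
            violations.append(f"missing required field: {field}")
    return violations

def run_quality_report(rows):
    """Summarise violations across a batch of records, by row index."""
    report = {}
    for i, row in enumerate(rows):
        for violation in check_row(row):
            report.setdefault(violation, []).append(i)
    return report

rows = [
    {"subject_id": "S001", "visit_date": "2023-01-10"},
    {"subject_id": "", "visit_date": "2023-01-11"},  # empty id
    {"subject_id": "S003"},                          # no visit date
]
report = run_quality_report(rows)
```

In a production framework the same idea would typically be expressed as Spark column expressions so checks run in parallel across partitions.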
- Strong Product Designer profile, with both UI and UX work for B2C products
- Mandatory (Experience 1) - Must have 1+ years of experience in end-to-end product design, including UX work such as UX research, user personas, and workflows
- Mandatory (Experience 2) - Expertise in tools like Figma, Sketch, Adobe XD, and others to create high-fidelity wireframes and prototypes
- Mandatory (Portfolio) - Must have a strong portfolio of UI/UX work on good B2C products. The portfolio must show detailed case studies including wireframing, prototyping, interaction design, and visual design
Preferred
- Preferred (Company) – Product Companies
> Strong knowledge of the Lightning framework
> Experience working with Sales or Service Cloud, Community Cloud, Marketing Cloud
> Experience with REST & SOAP APIs; knowledge of the Governor limits involved during integration
> Working knowledge of continuous integration and working with repositories (e.g. Git)
> Should have experience designing data flows and working with any ETL tool or data loader
About the company
Credit cards haven't changed much for over half a century, so our team of seasoned bankers, technologists, and designers set out to redefine the credit card for you - the consumer. The result is OneCard - a credit card reimagined for the mobile generation. OneCard is India's best metal credit card built with full-stack tech. It is backed by the principles of simplicity, transparency, and giving back control to the user.
The Engineering Challenge
“Re-imagining credit and payments from First Principles”
Payments is an interesting engineering challenge in itself with requirements of low latency, transactional guarantees, security, and high scalability. When we add credit and engagement into the mix, the challenge becomes even more interesting with underwriting and recommendation algorithms working on large data sets. We have eliminated the current call center, sales agent, and SMS-based processes with a mobile app that puts the customers in complete control. To stay agile, the entire stack is built on the cloud with modern technologies.
Check out our apps here:
OneCard (Best credit card app): www.getonecard.app
OneScore (5 million downloads): www.onescore.app
Senior Software Engineer – Frontend
Create a consumer-facing front-end application that will be used by millions of users.
What you will do:
- Develop our mobile apps using React Native.
- Work with backend, frontend, and other developers to build out a customer-centric experience that will constantly evolve.
- Work with our designers and customer success team to create a seamless user experience.
- Respond to support team tickets as needed to resolve bugs and issues.
- Participate in contributing ideas, updates, and product development areas to the team.
Experience Range:
3-5 years of hands-on technical experience building a consumer mobile app. You must have shipped an app to the stores.
Technical Expertise:
- 3-5 years of experience with Android/iOS native apps or React Native apps, and web front-end technologies such as React.
- Strong command of JavaScript ES6.
- Know how to build (fairly) complex layouts without using any CSS frameworks.
- Strong problem solving and analytical skills using data structures.
- Comfortable in a fast-paced startup environment.
- Bias for action: shipping quality code quickly.
- Minimum 1 year of relevant experience in PySpark (mandatory)
- Hands-on experience developing, testing, deploying, maintaining, and improving data integration pipelines in an AWS cloud environment is a plus
- Ability to play a lead role and independently manage a 3-5 member PySpark development team
- EMR, Python, and PySpark are mandatory.
- Knowledge of and experience working with AWS cloud technologies such as Apache Spark, Glue, Kafka, Kinesis, and Lambda, along with S3, Redshift, and RDS
JD –
Experience implementing integration solutions using Oracle Integration Cloud Service.
Has developed integrations between SaaS applications (Oracle Cloud ERP, Oracle Cloud HCM) and between SaaS and PaaS applications.
Should have worked extensively with at least 3-4 technology adapters, such as the File, Database, Oracle ERP, and FTP adapters.
Should have excellent skills in web service technologies such as XML, XPath, XSLT, SOAP, WSDL, and XSD.
Experience in all phases of the software development lifecycle, from requirements gathering to documentation, testing, implementation, and support. Ability to troubleshoot technical and configuration issues.
Should be able to communicate effectively with the functional & technical groups and various technical team members.
Ensure completion of tasks, milestones, and components including Technical specifications, design specifications, configurations, quality assurance, implementations, and project reviews
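The XML/XPath skills listed above can be illustrated with a short stdlib example (the invoice payload and element names are invented for demonstration, not an actual OIC message shape):

```python
import xml.etree.ElementTree as ET

# Hypothetical integration payload; structure invented for illustration.
payload = """
<invoice>
  <header><id>INV-001</id></header>
  <lines>
    <line><amount>100.50</amount></line>
    <line><amount>49.50</amount></line>
  </lines>
</invoice>
"""

root = ET.fromstring(payload)

# ElementTree supports a limited XPath subset in find/findtext/iterfind.
invoice_id = root.findtext("header/id")
total = sum(float(a.text) for a in root.iterfind("lines/line/amount"))
```

For full XPath 1.0 or XSLT transforms, a library such as lxml would typically be used instead of the stdlib parser.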
- 5+ years of Quality Assurance/Testing experience.
- 3+ years of Data Quality experience, or SDET experience with a focus on data, data warehousing, reporting, etc.
- 3+ years of Data Quality experience, or QA experience with a focus on Android, iOS, Roku, and connected devices.
- 3+ years of testing experience working within an Agile environment, and with Agile Management tools such as JIRA.
- Experience with Automation Framework development using Java.
- Experience with Performance Test Design, Development, and load testing execution.
- Design, create, and maintain assets used to execute performance tests, and contribute to the execution and monitoring of performance test runs using Apache JMeter, LoadRunner, or similar tools.
- Working knowledge of Java, the JVM, Spring Boot, data warehousing, data integration, SQL Server, Apache Kafka, data streaming, big data, MongoDB, SQL, web services, microservices, ETL, change data capture (CDC), and DevOps.
- Strong SQL experience, with knowledge of AWS Redshift, Snowflake, or columnar databases.
- Experience with reporting or analytics tools like Tableau or Mode.
- Experience working with Amazon Web Services, querying, and working with data in various AWS services.
- Programming experience in a language such as Python, Java, etc. for the purposes of parsing files and running queries.
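The "parsing files and running queries" requirement above can be sketched with the stdlib alone (the CSV content and table name are hypothetical; a real workflow would read from a file or an S3 object):

```python
import csv
import io
import sqlite3

# Hypothetical CSV export of playback events per device.
raw = "device,events\nandroid,120\nios,80\nroku,40\n"

# Parse the file into dicts, one per row.
rows = list(csv.DictReader(io.StringIO(raw)))

# Load into an in-memory database and run a query over it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (device TEXT, events INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(r["device"], int(r["events"])) for r in rows],
)
total, = conn.execute("SELECT SUM(events) FROM events").fetchone()
```

Against Redshift or Snowflake the same pattern applies, with the driver (e.g. a DB-API connector) swapped in for sqlite3.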
- Experience with analytics implementations (network events, ad beacons, user action events, etc.) in a web or mobile application.
The Database Developer will perform day-to-day database management, maintenance, and troubleshooting, providing Tier 1 and Tier 2 support for diverse platforms including, but not limited to, MS SQL, Azure SQL, MySQL, PostgreSQL, and Amazon Redshift. They are responsible for maintaining functional/technical support documentation and operational documentation, as well as reporting on performance metrics associated with job activity and platform stability. They must adhere to SLAs pertaining to data movement and provide evidence and supporting documentation for incidents that violate those SLAs. Other responsibilities include API development and integrations via Azure Functions, C#, or Python.
Essential Duties and Responsibilities
• Advanced problem-solving skills
• Excellent communication skills
• Advanced T-SQL scripting skills
• Query optimization and performance tuning; familiarity with traces, execution plans, and server logs
• SSIS package development and support
• PowerShell scripting
• Report visualization via SSRS, Power BI, and/or Jupyter Notebook
• Maintain functional/technical support documentation
• Maintain operational documentation specific to automated jobs and job steps
• Develop, implement, and support user-defined stored procedures, functions, and (indexed) views
• Monitor database activities and provide Tier 1 and Tier 2 production support
• Provide functional and technical support to ensure the performance, operation, and stability of database systems
• Manage data ingress and egress
• Track issue and/or project deliverables in Jira
• Assist in RDBMS patching, upgrades and enhancements
• Prepare database reports for managers as needed
• API integrations and development
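The API-integration duty above can be sketched in Python with the stdlib (the endpoint URL and payload are hypothetical, and the request is constructed but deliberately not sent, so the sketch runs offline):

```python
import json
import urllib.request

# Hypothetical endpoint for triggering a database job; no network call is made.
url = "https://example.com/api/jobs/refresh"
payload = {"database": "reporting", "action": "refresh"}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it; omitted here to stay offline.
```

In an Azure Functions context, the same request-building logic would live inside the function handler, with authentication headers added per the target API.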
Background/Experience
• Bachelor's or advanced degree in computer science
• Microsoft SQL Server 2016 or higher
• Working knowledge of MySQL, PostgreSQL and/or Amazon Redshift
• C# and/or Python
Supervisory/Budget Responsibility
• No Supervisory Responsibility/No Budget Responsibility
Level of Authority to Make Decisions
The Database Developer expedites issue resolution pursuant to the available functional/technical documentation. Issue escalation is at their discretion and should result in additional functional/technical documentation for future reference. However, individual problem solving, decision making, and performance tuning will constitute 75% of their time.