Job description
Ruby on Rails Developer Responsibilities:
- Designing and developing new web applications.
- Maintaining and troubleshooting existing web applications.
- Writing and maintaining reliable Ruby code.
- Integrating data storage solutions.
- Creating back-end components.
- Identifying and fixing bottlenecks and bugs.
- Integrating user-facing elements designed by the front-end team.
- Connecting applications with additional web servers.
- Maintaining APIs.
Ruby on Rails Developer Requirements:
- Bachelor's degree in Computer Science, Computer Engineering, or related field.
- Experience working with Ruby on Rails as well as libraries like Resque and RSpec.
- Ability to write clean Ruby code.
- Proficiency with code versioning tools including Git, GitHub, SVN, and Mercurial.
Please complete the assignment as early as possible so that we can start the interview process.
https://docs.google.com/forms/d/e/1FAIpQLSczf-t9l-CHqI8BpiGNndxTtzZz7fhzKsvxyjD-w9Fe_QWcMw/viewform?vc=0&c=0&w=1&flr=0
Good to Have Skills:
- 3+ years of experience using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem-solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding of AI/ML algorithms is a plus.
Position (Designation/Job Title): PHP Laravel Developer
We're looking for a self-motivated PHP / Laravel Developer.
Applicants must have a strong desire to create amazing software and be able to work in a fast-paced, collaborative atmosphere.
No. of Positions: 1
Employment Type: Full-Time (In-Office)
Job Location: New Delhi | Rajouri Garden
Qualification & Experience: B.Tech/MCA or any graduate & 1-4 years
Salary range: Based on interview performance
Male/Female/Any: Any
Company Name & address of the company:
Artistic Bird Tech Pvt. Ltd., A2, 3rd Floor, Maharana Pratap Market, Rajouri Garden, Near Rajouri Metro Station, New Delhi - 110027
Job Timings: 10 am - 7 pm
Weekly off: 1st and 3rd Saturday off
Contact Person: Akshay Patil
Technical Skills:
PHP, Laravel framework, Vue.js, JavaScript, MySQL, WordPress, SQLite, third-party API integrations, GitHub, core PHP, software modules, application development procedures, database architecture, and standard components.
For any query, you can contact me.
Have a good day!!
Thanks
Akshay Patil
Team - Artistic Bird Tech Pvt. Ltd.
JOB DESCRIPTION
DYT - Do Your Thing, is an app, where all social media users can share brands they love with their followers and earn money while doing so! We believe everyone is an influencer. Our aim is to democratise social media and allow people to be rewarded for the content they post. How does DYT help you? It accelerates your career through collaboration opportunities with top brands and gives you access to a community full of experts in the influencer space.
RESPONSIBILITIES
- Expert in Python with knowledge of Python best practices (PEP8)
- Strong knowledge of Python web frameworks such as Django and Flask
- Strong knowledge of building RESTful APIs using Django REST Framework (a minimal sketch follows this list)
- Good understanding of Django ORM libraries
- Able to integrate multiple data sources and databases into one system
- Strong experience on Linux
- Solid database skills in a relational database (e.g., PostgreSQL, MySQL)
- Able to create database schemas that represent and support business processes
- Strong unit test and debugging skills
- Proficient understanding of code versioning tools (git)
- Experience deploying on AWS is desirable
- Experience with Docker and Test-Driven Development will be a plus
- Excellent interpersonal, leadership, influence, and communication skills
- Experience in designing scalable microservices is desirable
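For a sense of what "building RESTful APIs using Django REST Framework" typically looks like, here is a minimal, hypothetical sketch; the model, serializer, and route names are illustrative and not part of any actual product.

```python
# Hypothetical DRF sketch (would live inside a Django app; names are illustrative).
from django.db import models
from rest_framework import serializers, viewsets, routers


class Product(models.Model):
    name = models.CharField(max_length=255)
    price = models.DecimalField(max_digits=10, decimal_places=2)


class ProductSerializer(serializers.ModelSerializer):
    class Meta:
        model = Product
        fields = ["id", "name", "price"]


class ProductViewSet(viewsets.ModelViewSet):
    # Exposes list/retrieve/create/update/delete endpoints at /products/
    queryset = Product.objects.all()
    serializer_class = ProductSerializer


router = routers.DefaultRouter()
router.register(r"products", ProductViewSet)
# In a real project the router's URLs are included from urls.py:
# urlpatterns = [path("", include(router.urls))]
```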
QUALIFICATIONS
- 1-3 years of experience as a backend developer
- At least one product build and published
SKILLS
- Contribute to all phases of the development lifecycle
- Write well-designed, testable, efficient code
- Work well under pressure and meet deadlines without sacrificing quality
- Work with distributed development teams
- Performs analysis of functional and business requirements
- Applies in-depth or broad technical knowledge to maintain data engineering functions and perform solution design.
- Applies company, open-source, and third-party technologies to highly complex infrastructure and software solutions.
- Introduces new product features and enhances the platform
- Ability to translate business reporting requirements into a production report.
- Bachelor’s or master’s degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field. Relevant experience of at least 3 years in lieu of the above if from a different stream of education.
- Well-versed in, and 3+ years of hands-on demonstrable experience with:
  ▪ Stream & batch big data pipeline processing using Apache Spark and/or Apache Flink
  ▪ Distributed cloud-native computing, including serverless functions
  ▪ Relational, object store, document, graph, etc. database design & implementation
  ▪ Microservices architecture, API modeling, design, & programming
- 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.
- Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework (a brief sketch appears after this list).
- In-depth knowledge of standard programming languages such as Scala and/or Java.
- 3+ years of hands-on development experience in one or more libraries & frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie & Hive), etc.; extra points if you can demonstrate your knowledge with working examples.
- 3+ years of hands-on development experience in one or more relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc.
- Practical knowledge of distributed systems involving partitioning, bucketing, the CAP theorem, replication, horizontal scaling, etc.
- Passion for distilling large volumes of data and analyzing performance, scalability, and capacity issues in big data platforms.
- Ability to clearly distinguish between system and Spark job performance, and to perform Spark performance tuning and resource optimization.
- Perform benchmarking/stress tests and document best practices for different applications.
- Proactively work with tenants on improving overall performance and ensure the system is resilient and scalable.
- Good understanding of virtualization & containerization; must demonstrate experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Well-versed in, and demonstrable working experience with, API management, API gateways, service mesh, identity & access management, and data protection & encryption.
- Hands-on, demonstrable working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality & security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate experience in at least FIVE (5) services offered under AWS and/or Azure and/or Google Cloud in any of these categories: Compute, Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, and Security, Identity, & Compliance (or equivalent demonstrable cloud platform experience).
- Good understanding of storage, networks, and storage networking basics, which will enable you to work in a cloud environment.
- Good understanding of network, data, and application security basics, which will enable you to work in a cloud as well as a business applications / API services environment.
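Purely for illustration, a minimal Spark SQL batch job of the kind described above might look like the sketch below. It uses PySpark for brevity (the role itself asks for Scala and/or Java), and the input path, schema, and column names are assumptions.

```python
# Minimal PySpark batch job: read events, aggregate, and write results.
# File paths, schema, and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-aggregation").getOrCreate()

# Read a (hypothetical) Parquet dataset of click events.
events = spark.read.parquet("s3a://example-bucket/events/")

# Aggregate events per user per day using Spark SQL functions.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("user_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Write the result back out, partitioned by date.
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-bucket/daily_counts/"
)

spark.stop()
```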
Type: Individual contributor with good hands-on proficiency.
Must have
- Strong proficiency in at least one of Java, Ruby, Python
- Exposure to databases: any of PostgreSQL, MySQL, Apache Cassandra
- Any NoSQL database experience is a plus
- Exposure to AWS cloud infrastructure: EC2 or S3
- Proficiency with Git
- MUST: Using REST to make API calls (see the minimal example below).
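A minimal illustration of the REST requirement above, using Python's requests library; the endpoint URL, parameters, and token are placeholders.

```python
# Minimal REST call with the requests library; URL, params, and token are placeholders.
import requests

response = requests.get(
    "https://api.example.com/v1/orders",
    params={"status": "shipped"},
    headers={"Authorization": "Bearer <token>"},
    timeout=10,
)
response.raise_for_status()   # fail fast on non-2xx responses
orders = response.json()      # parsed JSON body
print(len(orders), "orders returned")
```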
Great to have:
- Experience working with one or more middleware, enterprise bus, queueing frameworks
- Any of Memcached/Redis, Apache Kafka / RabbitMQ / PubSub+ / AmazonMQ
Soft skills:
- Appreciation for clean and well-documented code
What will you do at Tradyl:
(Examples for illustration only)
- Build a shipping service module that is called by our website to query shipping rates from India to a destination country. Configure this so that an Ops person can update shipping costs as and when they change. Own deployment and monitoring of this service (a rough sketch appears after this list).
- Use Zapier to build a workflow to export a MixPanel report into a Google sheet every day.
- Change our supplier portal (built on bubble.io) to make an API call to our customer-facing site whenever a supplier modifies their profile.
- Write an alert mechanism, which can run every day, that identifies catalogues with insufficient information and makes them non-discoverable.
- Work with the Business Team to design a workflow for product inwarding using Airtable. Write a small app within Airtable so that whenever a product is marked as “shipped” in Airtable, it updates the customer-facing website.
- Use an open source dashboarding framework to create a quick dashboard to track important business events.
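To make the first example above (the shipping service module) concrete, a rough sketch is shown below. The framework choice (FastAPI), the in-memory rate table, and all field names are assumptions for illustration only, not Tradyl's actual design; in practice the rates would live somewhere an Ops person can edit.

```python
# Hypothetical sketch of a shipping-rate lookup service (FastAPI chosen for brevity).
from fastapi import FastAPI, HTTPException

app = FastAPI()

# Illustrative per-kg rates from India; in a real system this would come from a
# store that Ops can edit (e.g., a sheet or a database table).
SHIPPING_RATES_USD_PER_KG = {"US": 12.0, "GB": 10.5, "AE": 7.0}


@app.get("/shipping-rate")
def shipping_rate(destination_country: str, weight_kg: float):
    """Return the estimated shipping cost from India to the destination country."""
    rate = SHIPPING_RATES_USD_PER_KG.get(destination_country.upper())
    if rate is None:
        raise HTTPException(status_code=404, detail="No rate configured for country")
    return {
        "destination_country": destination_country.upper(),
        "weight_kg": weight_kg,
        "cost_usd": round(rate * weight_kg, 2),
    }
```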
• Construct, develop, code, debug and maintain applications.
• Conform to the defined software design methodology for the development and implementation of Internet-based applications to support all aspects of website functionality.
• Perform code review and evaluation and determine recommendations for adaptation.
• Create low-level design documents from functional specifications and technical design documents.
• Generate application test data as necessary and validate any data conversion requirements for final implementation and production roll out.
• Be responsible for the analysis, design, and development of certain key business applications.
Skills Required
I. Hands-on experience with the Dialogic/Telesoft SIGTRAN/TDM stack and its configurations
II. Understanding and implementation of SS7 signalling servers
III. Good experience with SS7 protocols such as the MTP, SCCP, TCAP, ISUP, and MAP layers, and the SMPP protocol
IV. Must have strong C and C++ programming skills on Linux, developing multi-threaded applications.
V. Working knowledge of MySQL and other relational DBs.
VI. Working knowledge of products like SMSC, MCA, USSD, OBD, IB will be an added advantage.
Education: B.Tech/BE or MCA (Computers) Regular
- Develop and manage e-commerce websites and web applications.
- Analyze, design, code, debug, test, document & deploy applications.
- Participate in project & deployment planning.
- Must be a self-starter & be able to work with minimum supervision
- Experience in modules/extensions development/customization.
- Experience in theme integration/customization.
- Experience in API creation/integration.
- Experience in migration from Magento 1 to Magento 2.
- Extensive experience of PHP and MySQL.
- Exposure to Magento 2, CMS, and JavaScript frameworks such as jQuery.
- Demonstrable knowledge of XML, XHTML, CSS, and modules (i.e., API integration, payment gateways, XML) with a focus on standards.
- Demonstrable source control experience
- Two or more published websites in E-Commerce