
- Editorialist YX is looking for a Technical Architect - Search. As part of this role, you will work with a team that builds a unified search platform to power various searches for our .com website, iOS app, and internal support tools. This search impacts thousands of customers a day and will become pivotal to our tech efforts as we continue to grow 30x YoY.
- You will own the technical direction for the team, and you will be leading key search projects from ideation all the way to deployment. You will be working closely with both technical and business leaders to fulfill your mission.
- Salary is no bar for the right candidate.
QUALIFICATION
- 6+ years of experience working in Java and web services.
- 6+ years of experience working in the Search domain.
- Proven skills in designing scalable, highly available distributed systems which can handle high data volumes.
- Strong understanding of software engineering principles and fundamentals including data structures and algorithms.
- Solid understanding of concurrency and multi-threading, multiple design patterns, and debugging and analytical methodologies.
- Hands-on experience with SolrCloud or Elasticsearch.
- Deep understanding of information retrieval concepts.
- Deep understanding of linguistic processing, such as tokenizers, spell checkers, and stemmers (see the analyzer sketch after this list).
- Hands-on experience with big data tech stacks such as Hadoop, Hive, Cassandra, and Spark is a plus.
- Self-directed, self-motivated, and detail-oriented with the ability to come up with good design proposals and thorough analysis of production issues.
- Excellent written and oral communication skills on both technical and non-technical topics.
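To make the linguistic-processing bullet above concrete, here is a minimal Elasticsearch sketch using the elasticsearch-py 8.x client; the index name, field, and analyzer settings are illustrative assumptions, not details from the role.

```python
# Minimal sketch (assumptions: elasticsearch-py 8.x and a local cluster at localhost:9200).
# Shows a custom analyzer that combines a tokenizer, lowercasing, and a stemmer.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

es.indices.create(
    index="products",  # hypothetical index name
    settings={
        "analysis": {
            "analyzer": {
                "english_search": {
                    "type": "custom",
                    "tokenizer": "standard",                 # split text into terms
                    "filter": ["lowercase", "porter_stem"],  # normalize case, reduce to stems
                }
            }
        }
    },
    mappings={
        "properties": {
            "title": {"type": "text", "analyzer": "english_search"}
        }
    },
)

# "running shoes" and "run shoe" reduce to the same stems at index and query time.
es.index(index="products", id="1", document={"title": "Trail running shoes"}, refresh=True)
resp = es.search(index="products", query={"match": {"title": "run shoe"}})
print(resp["hits"]["total"]["value"])  # 1: matched via stemming
```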
RESPONSIBILITIES
- Design and build a search engine using Elasticsearch, working with engineers on the team to ensure the overall success of Search and other ML-based systems.
- Collaborate with peers from other Engineering groups to tackle complex and meaningful problems with efficient and scalable delivery of Search solutions.
- You are expected to be a self-motivated, dedicated, and solution-oriented individual. The main responsibilities for this position include: leading the effort to build large-scale, distributed, and highly available systems and pipelines.
- Leading the effort to build large-scale and highly available information retrieval systems.
- Design and develop solutions using the Java tech stack.
- Design and implement solutions as per secure coding guidelines.
- Work with QA to identify issues and fix them.
EDUCATION & EXPERIENCE
- B.Tech. in Computer Science or equivalent experience
- 6+ years of experience in Java, web services, and the search domain
- Experience working in a product-based company
BENEFITS
- Retiral Benefits
- Medical Insurance
- Remote Working Opportunity for the time being
- MacBook
- Stock Options
- Gym Membership

Similar jobs
- Handle inbound and outbound sales calls
- Qualify leads and understand customer requirements
- Present Triochat’s features and benefits clearly
- Schedule and conduct product demos
- Follow up with prospects and close deals
- Maintain and update leads in CRM
- Collaborate with product and support teams to improve customer experience
- Prior experience in SaaS sales or B2B sales (preferred)
Springer Capital is a cross-border asset management firm specializing in real estate investment banking between China and the USA. We are offering a remote internship for aspiring data engineers interested in data pipeline development, data integration, and business intelligence.
The internship offers flexible start and end dates. A short quiz or technical task may be required as part of the selection process.
Responsibilities:
- Design, build, and maintain scalable data pipelines for structured and unstructured data sources
- Develop ETL processes to collect, clean, and transform data from internal and external systems (a minimal sketch follows this list)
- Support integration of data into dashboards, analytics tools, and reporting systems
- Collaborate with data analysts and software developers to improve data accessibility and performance
- Document workflows and maintain data infrastructure best practices
- Assist in identifying opportunities to automate repetitive data tasks
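As a concrete reference for the ETL responsibility above, here is a minimal Python sketch assuming pandas and SQLAlchemy; the CSV file, column names, and SQLite "warehouse" are placeholders, not details from the internship description.

```python
# Minimal ETL sketch (assumptions: pandas + SQLAlchemy installed,
# a hypothetical listings.csv with 'city' and 'price_usd' columns).
import pandas as pd
from sqlalchemy import create_engine

# Extract: read a raw, possibly messy source file.
raw = pd.read_csv("listings.csv")

# Transform: clean and normalize before loading.
clean = (
    raw.dropna(subset=["city", "price_usd"])                       # drop incomplete rows
       .assign(city=lambda df: df["city"].str.strip().str.title()) # normalize text
       .astype({"price_usd": "float64"})
)

# Load: write to a small local warehouse that dashboards and BI tools can query.
engine = create_engine("sqlite:///warehouse.db")
clean.to_sql("listings", engine, if_exists="replace", index=False)
```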
Role: BDE
Exp: 0-4 Years
CTC: 3-5 LPA
Key Responsibilities:
1. Guiding prospective students to select the right course.
2. Responsible for selling courses through outbound, inbound, and online processes.
3. Achieving weekly and monthly sales targets.
4. Self-motivated and organised, capable of managing a high volume of customer interactions.
5. Counselling working professionals on suitable courses as per their domain.
6. Familiarity with CRM software for effective lead management and reporting will be an added advantage.
7. Identifying opportunities for new business development through lead generation.
8. Coordinating pre-sales and post-sales follow-up.
9. Creating and maintaining a database of prospective clients.
Role: SCM Ops Lead
Location: Sultanpur, Delhi
About the company:
They are building a data and technology-driven, driver-centric platform to electrify last-mile mobility in the country.
Our Fi-Ne-Tech platform is easing electric mobility adoption by solving three major problems for drivers: access to credit, low earnings, and poor asset life.
It is a one-stop solution: easy financing, predictable asset management, and a wider distribution network for energy and service will power a community of a million drivers towards financial security.
Scope of Work
1. The incumbent will be responsible for stitching together the process for fulfilling demand
2. Take full ownership of warehousing, the products being handled, and inventory management
3. Supply/demand planning, preparing and forecasting for future demand, and maintaining MOQ (minimum order quantity)
4. Logistics management for inbound and outbound transportation of materials
Reporting: Head of Customer Success
Growth Path:
With the zeal to learn and grow, the person can delve into any of the following roles and more:
1. Supply Chain Manager
2. Logistics/Inventory Manager
3. Procurement and Process Manager
So, if you are
● Great with coordination and documentation
● Able to understand the concept of ‘ZERO’ inventory
● Smart enough to forecast and manage supplies as per demand
● And negotiation is your key skill,
then this role is for you.
- Experience in onsite functional or technical roles on at least four full-lifecycle CPQ projects, with strong implementation expertise in at least two of the following technologies: Conga CPQ and CLM, Salesforce CPQ & billing, Vlocity CPQ
- Two or more years of consulting and/or Lead-to-Cash (CPQ/CLM) implementation experience. Experience managing at least two large-scale, full-lifecycle implementations of Lead-to-Cash (CPQ/CLM) solutions, including ownership of the technical solution, management of the overall team, and ownership and management of project financials.
- Deep Understanding of the Lead-To-Cash business process and supporting technologies
- Experience defining systems strategy, road-map, developing process flow diagrams, developing systems requirements, designing and prototyping, testing, training, defining support procedures, and implementing practical business solutions.
- Experience supporting pursuit teams.
- Desire to learn additional in-demand CPQ/CLM platforms
- Ability to travel up to 50% on average, based on the work you do and the clients and industries/sectors you serve
- Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future
Preferred Qualifications:
- A Bachelor’s degree
- Salesforce or Conga CPQ certified
- Ability to work independently and manage multiple task assignments
- Strong oral and written communication skills, including presentation skills (MS Visio, MS PowerPoint)
- Strong problem solving and troubleshooting skills with the ability to exercise mature judgment
- An advanced degree in the area of specialization
- Experience implementing other CRM platforms (SAP CRM, Oracle, Salesforce.com, Microsoft Dynamics, etc.) or CPQ/CLM technologies (Oracle, Salesforce, Conga, etc.)
- Proficient in all phases of the Application Development Lifecycle
- Strong technical project management and/or leadership skills
Roles and Responsibilities:
- Understand product requirements and be able to quickly turn around a functional prototype for internal review and further refinement.
- We currently use the MERN stack and Flutter for our product. You should be acquainted with this tech stack or able to pick it up quickly.
- Develop an API-based architecture to support business growth and the product's integration capabilities.
- Design and develop REST APIs using MongoDB, including its security configuration.
- Develop the mobile UI / back-end features using the Flutter / Dart programming language.
- Create and configure EC2 instances and secure the connection with the MongoDB Cloud database (an illustrative sketch follows this list).
- Should be adept with version control systems such as Git/GitHub.
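Purely as an illustration of the EC2 and MongoDB Cloud bullet above, here is a hedged Python sketch using boto3 and pymongo (the role itself centers on the MERN and Flutter stack); the AMI ID, security group, and connection string are placeholders.

```python
# Illustrative sketch only (assumptions: boto3 + pymongo installed, AWS credentials
# configured, and the placeholder IDs/URIs below replaced with real values).
import boto3
from pymongo import MongoClient

# Launch a single EC2 instance to host the application backend.
ec2 = boto3.client("ec2", region_name="ap-south-1")
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",            # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    SecurityGroupIds=["sg-0123456789abcdef0"],  # placeholder security group
)

# Connect securely to a MongoDB Cloud (Atlas) cluster over TLS.
client = MongoClient(
    "mongodb+srv://user:password@cluster0.example.mongodb.net/appdb",  # placeholder URI
    tls=True,
)
print(client.admin.command("ping"))  # simple connectivity check
```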
Job Description
We are looking for a highly capable machine learning engineer to optimize our deep learning systems. You will be evaluating existing deep learning (DL) processes, doing hyperparameter tuning, performing statistical analysis (logging and evaluating model performance) to resolve data set problems, and enhancing the accuracy of our AI software's predictive automation capabilities.
You will be working with technologies like AWS SageMaker, TensorFlow JS, and TensorFlow/Keras/TensorBoard to create deep learning backends that power our application.
To ensure success as a machine learning engineer, you should demonstrate solid data science knowledge and experience in a deep learning role. A first-class machine learning engineer is someone whose expertise translates into the enhanced performance of predictive automation software. To do this job successfully, you need exceptional skills in DL and programming.
Responsibilities
- Consulting with managers to determine and refine machine learning objectives.
- Designing deep learning systems and self-running artificial intelligence (AI) software to automate predictive models.
- Transforming data science prototypes and applying appropriate ML algorithms and tools.
- Carrying out data engineering subtasks such as defining data requirements and collecting, labeling, inspecting, cleaning, augmenting, and moving data.
- Carrying out modeling subtasks such as training deep learning models, defining evaluation metrics, searching hyperparameters, and reading research papers (see the training sketch after this list).
- Carrying out deployment subtasks such as converting prototyped code into production code, working in depth with AWS services to set up the cloud environment for training, improving response times, and saving bandwidth.
- Ensuring that algorithms generate robust and accurate results.
- Running tests, performing analysis, and interpreting test results.
- Documenting machine learning processes.
- Keeping abreast of developments in machine learning.
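As a minimal reference for the modeling subtasks above, here is a TensorFlow/Keras sketch (frameworks named elsewhere in this posting); the synthetic data, architecture, and hyperparameter values are illustrative assumptions.

```python
# Minimal sketch (assumptions: TensorFlow 2.x installed; synthetic data stands in
# for a real dataset). Shows training, an explicit evaluation metric, and the
# hyperparameters one would typically search over.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
x_train, y_train = rng.normal(size=(1000, 20)), rng.integers(0, 2, size=(1000,))
x_val, y_val = rng.normal(size=(200, 20)), rng.integers(0, 2, size=(200,))

# Hyperparameters that would normally be tuned: layer width, learning rate, epochs, batch size.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss="binary_crossentropy",
    metrics=["accuracy"],  # explicit evaluation metric
)
model.fit(x_train, y_train, validation_data=(x_val, y_val),
          epochs=5, batch_size=32, verbose=0)
print(model.evaluate(x_val, y_val, verbose=0))  # [loss, accuracy] on the validation set
```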
Requirements
- Proven experience as a Machine Learning Engineer or in a similar role.
- In-depth knowledge of AWS SageMaker and related services such as S3 (see the sketch after this list).
- Extensive knowledge of ML frameworks, libraries, algorithms, data structures, data modeling, software architecture, and math & statistics.
- Ability to write robust code in Python and JavaScript (TensorFlow JS).
- Experience with Git and GitHub.
- Superb analytical and problem-solving abilities.
- Excellent troubleshooting skills.
- Good project management skills.
- Great communication and collaboration skills.
- Excellent time management and organizational abilities.
- Bachelor's degree in computer science, data science, mathematics, or a related field; a Master's degree is a plus.
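To make the SageMaker requirement above concrete, here is a minimal sketch with the SageMaker Python SDK; the IAM role ARN, S3 paths, training script, and framework/instance choices are assumptions rather than details from the posting.

```python
# Minimal sketch (assumptions: sagemaker SDK installed, AWS credentials configured,
# a training script train.py exists locally, and the ARN/S3 paths are placeholders).
from sagemaker.tensorflow import TensorFlow

estimator = TensorFlow(
    entry_point="train.py",                                   # your training script
    role="arn:aws:iam::123456789012:role/SageMakerRole",      # placeholder IAM role
    instance_count=1,
    instance_type="ml.m5.xlarge",
    framework_version="2.11",
    py_version="py39",
    hyperparameters={"epochs": 5, "batch_size": 32},
)

# Data channels map to S3 prefixes; the fitted model artifact is written back to S3.
estimator.fit({"train": "s3://your-bucket/train", "validation": "s3://your-bucket/val"})
```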
Responsibilities and Duties
- Should have good knowledge and 2+ years of experience in core PHP, OpenCart, Magento, WordPress, Shopify, etc.
- Produce detailed specifications
- Maintain and optimize functional code
- Contribute to all phases of the development lifecycle
- Develop and deploy new features to facilitate related procedures and tools.
- Focus on Unit Testing and ensure the test-driven process
Qualifications and Skills
- Proven software development experience in PHP, Web services, WSDL, XML, and SOAP
- Familiarity with the Model-View-Controller (MVC) architectural pattern, and previous experience developing web applications using this pattern or on existing MVC frameworks
- Good knowledge of relational databases, experience interacting with MySQL database systems through an abstraction layer, in addition to the ability to write raw SQL queries.
- Front-end development experience with strong Javascript skills (including frameworks/libraries such as jQuery, AJAX, JSON) as well as familiarity with accepted CSS and HTML design standards.
- Experience with common third-party APIs (e.g. Google, Facebook, eBay) will be an added advantage.
- Passion for best design and coding practices and a desire to develop new bold ideas.
- Advanced Spark programming skills
- Advanced Python skills
- Data engineering ETL and ELT skills
- Expertise in streaming data
- Experience in the Hadoop ecosystem
- Basic understanding of cloud platforms
- Technical design skills, alternative approaches
- Hands-on expertise in writing UDFs (see the sketch below)
- Hands-on expertise in streaming data ingestion
- Able to independently tune Spark scripts
- Advanced debugging skills and large-volume data handling
- Able to independently break down and plan technical tasks
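For the UDF item above, here is a minimal PySpark sketch; the data, column names, and normalization logic are illustrative assumptions. In practice, built-in functions are preferred where they exist, since Python UDFs bypass Spark's query optimizations.

```python
# Minimal sketch (assumption: pyspark installed; the data and column names are made up).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("udf-sketch").getOrCreate()
df = spark.createDataFrame([(" New DELHI ",), ("mumbai",)], ["city"])

# A Python UDF that normalizes free-text city names.
@F.udf(returnType=StringType())
def normalize_city(value):
    return value.strip().title() if value is not None else None

df.withColumn("city_clean", normalize_city("city")).show()
# Note: Python UDFs serialize rows into the Python worker, so prefer built-ins
# (e.g. F.initcap(F.trim("city"))) where possible for better performance.
```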









