Software Engineer in United Kingdom

26 days ago
Legalist
Mid Level
Full Time
Europe
Founded by Christian Haigh & Eva Shang in 2016, Legalist sits at the forefront of AI-enabled legal-asset finance. Backed by Y Combinator, Refactor Capital & other Silicon Valley investors, we use cutting-edge artificial intelligence to invest in litigation-related situations: we analyze public court records with statistical methods and provide capital to the lawyers & plaintiffs who need it most. Legalist currently manages over $200M across multiple legal-asset strategies.

As Legalist continues to grow, we are looking for talented innovators to further build out our product, engineering, & investment teams. Start a conversation with us & join the journey of driving the future of legal assets.

We are looking for a Software Engineer to join our team. In this role, you will manage all aspects of Legalist's business intelligence data, from collection to enrichment and enhancement. You will play a pivotal role in elevating and building out our analytics capabilities, owning the implementation of our various analytics tools and managing our pipeline from data collection to analysis-ready datasets.

Where You Come In:
  • Help design and implement our data-crawling architecture and a large-scale crawling system
  • Develop large-scale scraping tools and APIs, along with the data integrity, health, and monitoring systems that support them
  • Collaborate with our product and business teams to understand and anticipate requirements, striving for greater functionality and impact in our data-gathering systems
  • Design, implement, and maintain various components of our data infrastructure

What you’ll be bringing to the team:
  • 2+ years of experience with Python for data wrangling and cleaning
  • Advanced experience with SQL
  • Proficiency in crawling & scraping using libraries such as Scrapy, BeautifulSoup, or Selenium
    • Experience extracting data from multiple disparate sources, including the web, PDFs, and spreadsheets
    • Production experience with techniques and tools for crawling, extracting, and processing data
  • Sound knowledge of bypassing bot-detection techniques
    • Experience using HTTP proxy techniques to protect web scrapers against site bans, IP leaks, browser crashes, CAPTCHAs, and proxy failures
  • Experience with cloud environments such as GCP or AWS, as well as tools like Docker and Kubernetes
  • Ability to maintain all aspects of an analytics tech stack (analytics tool implementation, pipeline maintenance, data integrity)
  • Expertise in data-warehouse maintenance, specifically with Google BigQuery (ETLs, data sourcing, modelling, cleansing, documentation, and maintenance)
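To give a flavor of the data wrangling described above, here is a minimal sketch of turning raw scraped records into analysis-ready rows in Python. The field names and sample values are hypothetical, chosen only to illustrate the kind of normalization step involved:

```python
import re

# Hypothetical raw rows as they might come out of a scraper (illustrative only).
raw_rows = [
    {"case": "  Smith v. Jones ", "amount": "$1,200,000", "filed": "2023-01-15"},
    {"case": "Doe v. Acme Corp", "amount": "$350,000", "filed": "2023-02-20"},
]

def clean_row(row: dict) -> dict:
    """Normalize whitespace and parse the currency string into an integer."""
    return {
        "case": row["case"].strip(),
        "amount": int(re.sub(r"[^\d]", "", row["amount"])),  # strip "$" and ","
        "filed": row["filed"],
    }

cleaned = [clean_row(r) for r in raw_rows]
print(cleaned)
```

In practice, steps like this sit between the crawling layer and the warehouse, so that only validated, consistently typed records reach downstream analytics.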

Even better if you have, but not necessary:
  • Experience with microservices architecture
  • Familiarity with message brokers such as Kafka, RabbitMQ, etc
  • Experience with DevOps

Benefits:
  • Competitive salary
  • Health, dental, and vision
  • Annual Company Retreat
  • Twelve paid vacation days plus ten bankable federal holidays

Note: Remote / US Visa Sponsorship Available