🧠 Project Name:

Automating Job Screening & Cover Letters with AI — My Full Set

🔧 What This Workflow Does:

An AI-powered system that can take scraped job listings, clean and summarize the job descriptions, and then generate personalized cover letters based on your resume — all automated.

✅ Tools Used:

  • Apify (cloud platform for web scraping)

  • Make.com (automation tool)

  • Flask API (Python) (Custom backend for summarization & cover letter generation)

  • Ngrok (exposes my local Flask server through a temporary public URL)

  • HuggingFace (AI pipelines for summarization and text generation)

  • Slack (to receive drafts)

🗂️ Step-by-Step Tutorial:

🔹 Step 1: Getting Job Data

I started by scraping job listings from LinkedIn using Apify, which is an easy-to-use web scraping platform.

  1. Apify collects jobs with info like:

    • Job title

    • Company name

    • Job description

  2. Output is raw JSON, but this data can be messy or badly formatted — so cleaning it is important.

First step here would be to create an Apify account and select an actor from here: https://console.apify.com/actors.

The one I went with is this one: https://console.apify.com/actors/JkfTWxtpgfvcRQn3p/input by Umesh Patidar.

🔹 Step 2: Scrape Jobs with Apify and Send to Make

Get your Apify API Token

  • Go to [Apify Console → Settings → API & Integrations].

  • Click Create token, name it (e.g. "MakeBot"), and copy the token.

  1. Add Apify connection in Make.com

    • In your Make scenario, click modules, search for Apify, choose any Apify module (e.g. "Run an Actor").

    • On connection screen, select "API Token" and paste your token.

  2. Trigger Module:
    🔸 Apify -> Run an actor
    🔸 Apify -> Run an actor

    • Connect to your account

    • Insert the actor ID (JkfTWxtpgfvcRQn3p), which you can copy from https://console.apify.com/actors/JkfTWxtpgfvcRQn3p/input

    • Trigger: Insert your JSON query. I’ve used this one:

      {
        "contract_type": ["F"],
        "location": "Denmark",
        "proxyConfiguration": {
          "useApifyProxy": true,
          "apifyProxyCountry": "DK"
        },
        "query": "Marketing",
        "remote": ["1"],
        "time_interval": "ANY"
      }
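For reference, the same actor run can be started from Python against Apify's REST API instead of the Make module. This is a minimal sketch: the `/v2/acts/{actorId}/runs` endpoint shape is an assumption based on Apify's public API documentation, and the token is a placeholder.

```python
import json
from urllib.parse import quote

# Hypothetical sketch: start the same actor run via Apify's REST API.
# The v2 endpoint shape is an assumption based on Apify's public API docs.
APIFY_BASE = "https://api.apify.com/v2"

def build_actor_run_request(actor_id: str, token: str, run_input: dict):
    """Return the URL and JSON payload for starting an actor run."""
    url = f"{APIFY_BASE}/acts/{quote(actor_id)}/runs?token={token}"
    return url, run_input

run_input = {
    "contract_type": ["F"],
    "location": "Denmark",
    "proxyConfiguration": {"useApifyProxy": True, "apifyProxyCountry": "DK"},
    "query": "Marketing",
    "remote": ["1"],
    "time_interval": "ANY",
}

url, payload = build_actor_run_request("JkfTWxtpgfvcRQn3p", "<YOUR_TOKEN>", run_input)
# To actually start the run (requires the requests package):
# requests.post(url, json=payload, timeout=30)
print(url)
print(json.dumps(payload, indent=2))
```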

2. Pull the Scraped Data

Add a second module: Apify → Get Dataset Items

  • Use the output variable Dataset ID from the previous step.

  • Set Limit to a number equal to or larger than the items you expect (e.g., 10).

This module will convert the scraped JSON data into Make-friendly data arrays.
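Under the hood, "Get Dataset Items" is just a GET request against the run's dataset. A small sketch of the URL it builds; the endpoint shape is my assumption based on Apify's public API docs, and the dataset ID below is a placeholder.

```python
# Hypothetical sketch of the "Get Dataset Items" request. The /v2/datasets
# endpoint shape is an assumption based on Apify's public API docs.
def build_dataset_items_url(dataset_id: str, token: str, limit: int = 10) -> str:
    return (
        f"https://api.apify.com/v2/datasets/{dataset_id}/items"
        f"?token={token}&limit={limit}&format=json"
    )

url = build_dataset_items_url("abc123", "<YOUR_TOKEN>", limit=10)
# requests.get(url).json() would return a list of job objects, e.g.
# [{"job_title": ..., "company": ..., "job_description": ...}, ...]
print(url)
```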

Step 3: Insert a “Set Variable” module

Insert a “Set Variable” module after the “Get Dataset Items” module.

  1. Set it up like this:

    • Variable name: jobs

    • Value: Click into the value field and map the entire output from "Get Dataset Items" → it might just be a list of objects, or look like bundle[] or similar.

Step 4: Add an Iterator Module

This splits the array of job posts into individual bundles, so the following modules run once per job.

  1. Add: Tools > Iterator

  2. In “Array” field, choose the list of jobs from Apify output

Step 5: Transforming raw data into JSON code

This step takes raw job data and formats it into a proper JSON string for further processing.

This is important because the AI model expects structured data, and it needs to be clean before we continue.

📥 Example input:

job_description: ...
resume: ...
job_title: ...
company: ...
  1. Add: JSON > Transform to JSON

  2. Map the fields to the IDs from the Apify output
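In plain Python, this transformation amounts to serializing each job into one JSON string with the field names the backend expects. The field names mirror the example input above; the sample values are made up for illustration.

```python
import json

# Sketch of what the "Transform to JSON" step produces: one JSON string per
# job, with the field names the Flask endpoint expects. Sample data is made up.
def job_to_json(job: dict, resume: str) -> str:
    return json.dumps({
        "job_description": job.get("job_description", ""),
        "resume": resume,
        "job_title": job.get("job_title", ""),
        "company": job.get("company", ""),
    })

sample = {"job_title": "Marketing Specialist", "company": "Acme",
          "job_description": "Run campaigns.\nSee More"}
payload = job_to_json(sample, resume="Marketer with 5 years of experience.")
print(payload)
```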

Step 6: Clean the job description by removing unnecessary text

A problem I kept encountering: on LinkedIn, more often than not you have to click “See More” to view a position’s full job description. The scraped descriptions therefore contain “See More” artifacts, line breaks, and long runs of whitespace that break a clean JSON string.

Thus I had to clean the data manually. I did this by adding a Text Parser module.

Text Parser -> Replace cleans the job description by removing unnecessary text using regex.
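The same cleanup can be sketched in plain Python. The exact regex patterns are my assumption; tune them to whatever artifacts your scraped data actually contains.

```python
import re

# Sketch of the Text Parser -> Replace step: strip "See More"/"See Less"
# artifacts and collapse runs of whitespace. The patterns are assumptions;
# adjust them to your scraped data.
def clean_description(text: str) -> str:
    text = re.sub(r"\b(see more|see less|show more|show less)\b", "", text,
                  flags=re.IGNORECASE)
    text = re.sub(r"\s+", " ", text)  # collapse newlines/tabs/multiple spaces
    return text.strip()

raw = "We are hiring!\n\nSee More   Responsibilities:\n- Run campaigns\tSee Less"
print(clean_description(raw))
# -> "We are hiring! Responsibilities: - Run campaigns"
```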

Step 7: Format the clean JSON code

Add a JSON -> Parse JSON module. This parses the cleaned JSON data into a structured format for easier access.

Step 8: Add an HTTP module to send the job description

Module: “Make a request”

  1. Method: POST

  2. URL: https://<your-ngrok-link>.ngrok-free.app/summarize

  3. Headers:

    • Content-Type: application/json

  4. Body type: Raw

  5. Raw JSON:

{
  "text": "{{description}}"
}

Replace {{description}} with the job’s description field from the JSON module before.

NOTE: For this step I decided to create my own AI interface locally. I used three key tools:

🧪 Flask

A lightweight Python web framework. I used it to build a small server on my laptop that listens for job data (title, description, resume) and returns a summary + a custom cover letter.

🧠 Hugging Face

This is where the AI models live. I used:

  • A summarization model (DistilBART) to condense long job posts

  • A text generation model (GPT-2) to write a cover letter based on the summary + resume

Everything runs locally — no paid APIs.
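Before GPT-2 can write anything, the summary and resume have to be assembled into a prompt. The wording below is my assumption, not the exact prompt I used; the real generation call would be something like `pipeline("text-generation", model="gpt2")(prompt, max_new_tokens=200)`.

```python
# Sketch of assembling the cover-letter prompt before handing it to a
# text-generation pipeline. The prompt wording is an assumption; the real
# call would be roughly:
#   generator = pipeline("text-generation", model="gpt2")
#   generator(prompt, max_new_tokens=200)[0]["generated_text"]
def build_cover_letter_prompt(summary: str, resume: str,
                              job_title: str, company: str) -> str:
    return (
        f"Write a short cover letter for the {job_title} position at {company}.\n"
        f"Job summary: {summary}\n"
        f"Candidate background: {resume}\n\n"
        "Dear Hiring Manager,"
    )

prompt = build_cover_letter_prompt(
    summary="Plan and run marketing campaigns.",
    resume="Software engineer with Python and automation experience.",
    job_title="Marketing Specialist",
    company="Acme",
)
print(prompt)
```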

🌍 Ngrok

Since Flask runs only on my computer, I used Ngrok to create a temporary public URL. This lets Make.com or other services send data to my Flask server like it’s online.

This is completely optional and up to you. I will explain how I did it using the tools above.

Add a HTTP -> Make a request module. This sends the job data to a specified HTTP endpoint via a POST request.

This is the JSON query I used:

{
  "job_description": "{{11.job_description}}",
  "resume": "I'm a software engineer with experience in automation, Python, and Flask development. I enjoy solving problems and have contributed to various AI-powered projects.",
  "job_title": "{{11.job_title}}",
  "company": "{{11.company}}"
}

Map the IDs to the ones from the JSON module before this one.

Step 9 (final step): Add a Slack module to receive the final output

  • Go to https://api.slack.com/apps and click Create New App.

    1. Name it (e.g., “LinkedIn Draft Notifier”) and select your workspace.

    2. On the left menu, click Incoming Webhooks.

    3. Turn on Activate Incoming Webhooks.

    4. Scroll down, click Add New Webhook to Workspace.

    5. Choose the Slack channel where you want the messages (e.g., #personal-notifications).

    6. Click Allow.

    7. Copy the generated Webhook URL — you’ll need this in Make.

  • Choose Slack > Send a Message (via webhook).

  • Paste your Slack webhook URL.

  • For the message text, map the AI-generated draft content here.
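What the webhook module does is simple: POST a JSON body with a `text` field to the Incoming Webhook URL. A small sketch; the webhook URL in the comment is a placeholder, and the message layout is my own choice.

```python
import json

# Sketch of what Make's "Send a Message (via webhook)" does: POST a JSON
# body with a "text" field to the Incoming Webhook URL. The URL below is a
# placeholder; the message layout is an illustrative choice.
def build_slack_message(summary: str, cover_letter: str) -> dict:
    return {"text": f"*Job summary:*\n{summary}\n\n*Draft cover letter:*\n{cover_letter}"}

payload = build_slack_message("Run campaigns at Acme.", "Dear Hiring Manager, ...")
# requests.post("https://hooks.slack.com/services/XXX/YYY/ZZZ",
#               json=payload, timeout=10)
print(json.dumps(payload))
```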

🎯 Final Output

A fully automated job screening and cover letter generation tool:

✔ Takes job posts from Apify (scraped automatically)
✔ Cleans and transforms messy job data into valid JSON
✔ Sends it to my local Flask server
✔ AI (from Hugging Face) summarizes the job and writes a custom cover letter
✔ Everything runs for free using open-source models (DistilBART + GPT-2)
✔ Ngrok exposes my local server to the internet, so Make.com can send data directly
✔ The final output (cover letter + summary) gets posted to Slack for review

✅ No paid APIs
✅ Runs on my machine
✅ Works with any job post + resume

Built it once, and now it works on autopilot.

The final output can still be improved in terms of formatting, overall phrasing, and level of customization; here I just wanted to prove the workflow functions end to end.