Recruitment#

Recruitment process#

This document describes the stages of recruitment and the various evaluation points that occur during the process.

Launch of a recruitment round#

  • The epic planning managers are responsible for determining when the availability needs of their cells will require launching a recruitment round. They communicate this need as far in advance as possible to the recruitment managers and the CEO - ideally at least 1 month before the availability is required, to provide enough time to complete the round.
  • A cell should always target having some extra availability, to allow it to accept new projects without too much last-minute recruitment, which is more stressful for the cell and the recruitment managers. On top of the required availability, plan for 1-2+ extra newcomers as an availability margin, and plan ahead to replace any newcomer who doesn't pass their trial.
  • The CEO publishes the job ad, which directs candidates to submit the recruitment form.
  • The CEO informs all recruitment managers of the upcoming start of a recruitment round, and creates a workflow board to track the progression of the round's process.
  • The recruitment managers who participate in the recruitment round include a ticket for it in their upcoming sprint, and note in the comments of the workflow board how many newcomers they are looking for.
  • At the beginning of the first sprint of the recruitment round, the CEO imports the candidatures into a spreadsheet accessible to the recruitment managers - except the hourly rate each candidate asked for, which remains private. The CEO does a first filtering of the candidatures to ensure the rates are within the range we can afford.

Pre-selection of candidates for interviews#

Recruitment managers do a pre-selection of candidatures to invite for an interview. It is a very basic filter over the candidatures - we don't want to be too selective at this stage, as it can still be quite hard to tell whether a candidate would be a good fit just from that information. So there are few criteria, but they are strict - if a candidate fails to pass any of these, they are eliminated:

Contribution to third party projects#

We want to see at least one contribution (PR/patch) to a third party project which isn't completely trivial (a small bug fix is fine, but just fixing a typo, spacing or a missing import isn't enough - there needs to be something to evaluate), and which has been merged by the third party project.

No exceptions on this, it's a hard rule. This is the main filter of the pre-selection. So we check this first, and generally someone saying explicitly that they don't have contributions is enough to rule them out -- in these cases we save time by not having to look at the rest of the candidature.

Clarifications:

  • The type of work/tech from the contributions doesn't need to be related to our work.
  • The recipient project can be small, but should be something that has users (see its number of stars & forks - there should be at least 10-20 of each).
  • PRs done as part of employment are ok (that's also what we do!), but the work should really be done openly, and still preferably on a third party project. If the PRs are all silently merged, it means there was either no review, or it happened privately, and this doesn't really qualify as an open source contribution.
  • There should also be at least a PR description, and some comments/discussions with upstream - we are looking for people who communicate.
  • We are trying to filter for people who care about contributing to someone else's project, so merely releasing code on their github, or even contributions to a project they are a maintainer of, doesn't count.
  • Since candidates often just point at their github account, we get all their third-party PRs by visiting URLs like the following (we check both the github & gitlab accounts when they are provided):
  • Github: https://github.com/pulls?q=is%3Apr+author%3Aviadanna+-org%3Aviadanna
  • Gitlab: https://gitlab.com/dashboard/merge_requests/?scope=all&state=all&author_username=antoviaque
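These search URLs can be built programmatically from a candidate's usernames. Here is a minimal sketch in Python - the `third_party_pr_urls` helper is hypothetical (not part of our tooling), and simply reproduces the query formats above:

```python
# Hypothetical helper to build the third-party contribution search URLs
# for a candidate, given their GitHub and GitLab usernames.

def third_party_pr_urls(github_user: str, gitlab_user: str) -> dict:
    """Return search URLs listing PRs/MRs authored by the candidate."""
    return {
        # GitHub: PRs authored by the user, excluding their own org/account,
        # so only third-party contributions remain.
        "github": (
            "https://github.com/pulls"
            f"?q=is%3Apr+author%3A{github_user}+-org%3A{github_user}"
        ),
        # GitLab: all MRs authored by the user, across all projects and states.
        "gitlab": (
            "https://gitlab.com/dashboard/merge_requests/"
            f"?scope=all&state=all&author_username={gitlab_user}"
        ),
    }

urls = third_party_pr_urls("viadanna", "antoviaque")
print(urls["github"])
# https://github.com/pulls?q=is%3Apr+author%3Aviadanna+-org%3Aviadanna
```

Note that the GitHub query only excludes repositories owned by the candidate's own account/org, which matches the intent of filtering out self-merged work.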

Proper writing skills#

Candidates don't need to have flawless spelling & grammar, but it needs to be reasonable. We think about whether that type of writing would work with a client, for example. We aren't too harsh though - it can be hard to tell at times, and we can give the benefit of the doubt.

Python, Django & React#

We require experience in at least Python & Django, plus preferably React. Sometimes it's unclear - some candidates don't list all of their experience... So we wouldn't necessarily eliminate a promising candidate who doesn't list one of those (and React is not mandatory either, just very appreciated), but we note any doubts about these, to ask about during the interview.

However, if there is no mention or sign of any practice of Python, the candidate almost certainly has no Python experience, so we reject the candidature in these cases. Some candidates omit mentioning Django when they have only a small amount of experience with it, and that's still fine after verification, but nobody fluent in a language omits it from their list of skills, especially when other languages are mentioned.

Seniority#

We also currently don't hire junior developers - from past experiences, the remote environment combined with the expected quality and efficiency doesn't work well with junior profiles, at least with our current organization. We might revisit this in the future, but we would need to put in place a specific process to allow them to acquire the required skills and experience.

At the moment, we require at least 2-3 years of professional experience as a hired developer. We sometimes make an exception for a prolific open source contributor who has demonstrated great technical and social skills in their contributions, and thus already shows a senior profile.

Also, we accept candidates who have recently been employed by another Open edX provider, but we check for exclusivity clauses in their contract before proceeding with an interview (to be discussed with the CEO when someone from another provider applies).

Fields to fill#

In the spreadsheet containing the candidatures, besides the answers submitted by candidates, recruitment managers will see a few additional columns to fill:

  • Assigned to: The name of the recruitment manager assigned to review the candidature. We sometimes reassign some of them for the round of interviews, if there is a big imbalance -- which definitely happens, as good candidatures often appear in groups in the spreadsheet :)

  • Status: The current status of the candidature (drop-down).

  • Python, Django, React: This is a reminder to write in the cell any of those skills in which the candidate isn't clearly experienced. Then, during the interview, we ask the candidate about it. Sample value: "Django? React?" => which would be completed with the answer during the interview. For candidates who have all three pre-requirements, we put "OK" in this column - this helps ensure that we remembered to check (or to ask).

  • Comments: Meant to contain the explanation for the recruitment manager's decision.

  • The other fields are for the interview itself - see below.

Review of pre-selections#

Recruitment managers review each other's selections. A column in the spreadsheet indicates the name of the reviewer for each candidate, beside the assignee who evaluates the candidate.

Scheduling interviews#

Emailing selected candidates#

Once both the assignee and reviewer for each candidate agree, the recruitment manager assigned to the candidature sends an email to the candidates they have selected. We use a standard email template for the content of that email.

Scheduling through Calendly#

We use Calendly to schedule interviews. Get an account from Xavier if you don't already have one, and set up a dedicated event for interviews:

  • Open for the week following the pre-selection
  • At times which allow a reasonable coverage of most timezones (the afternoon UTC time is usually good for that)
  • Make sure to keep the times narrow, to allow batching the interviews - for focus, it's best not to have them spread all over your days
  • Enable Calendar invitations, to automatically send Google Calendar invites
  • Link it to the OpenCraft Zoom account (to allow to host longer meetings), and enable the automated inclusion of a Zoom URL in the meeting invite

We need to record interviews to allow for later review by other team members. To ensure we don't forget to start the recording during the meeting, we enable the option ahead of time, in the scheduled meeting details. The setting for each individual scheduled meeting should look like this:

zoom_recording.png

Recording in the cloud offers the most reliable way to ensure the meeting will have been recorded.

To be able to keep the candidate's reactions visible in the recording, even when they are not talking, make sure to select "Record gallery view with shared screen" in your account settings:

zoom_recording_gallery.png

Interviews#

Script#

The interviews last 30 minutes, and we use a script. The script is private, so as not to unduly advantage candidates who read the handbook ahead of the interview.

We don't necessarily say exactly and only the content of the script (we are not a call center ;p), but we try to stick to it, as the more similar it is across interviews, the better we are able to compare them with each other. This is especially true of the code exercise, where the way it is explained can significantly influence what the candidate will understand and how they will approach it.

Grading#

During the interview, we progressively grade the candidate in the corresponding columns of the spreadsheet, with a short comment on each. The rating is 1 to 5, with 5 being the highest. E.g. "5 - aced the exercise!".

Video recording upload#

We then upload the video recording of the interview (the 'Gallery view' file) to our private file drive, and add a link in the candidate's spreadsheet entry, in the dedicated column. This allows other team members to review it.

Final selection of newcomers#

  • The recruitment manager who interviewed the candidate decides whether to hire them as a newcomer.
  • The CEO reviews the selected candidates, confirms the decision, and contacts them to discuss contracting terms.
  • The recruitment managers send a rejection email to the candidates they have interviewed and who have been refused.

Onboarding & Trial Period Evaluation Process#

Once newcomers have signed their contract, been given access to the tools and joined the team, the onboarding and trial period starts.

Since newcomers may start at any time during the sprint, this process overlays the sprint process. Newcomers are expected to participate in sprint planning meetings, commit to tasks for the upcoming sprint, and practice time management using the sprint planning tools and by updating the Remaining Time estimate fields on their tasks.

As with all things at OpenCraft, this process is continually being reviewed and improved, so please provide any suggestions or feedback on your onboarding task.

Newcomer Weeks:

  • Week -1: Prior to your arrival, we will arrange for a core team member to be your mentor and to review your onboarding task. We'll also arrange your accounts and access to email, JIRA and the other communication tools.
  • Week 0: Work on your onboarding task, which involves reading documentation, completing the onboarding course, and setting up an Open edX devstack. You'll also have a newcomer-friendly task assigned to work on in the first week, after finishing your onboarding. Attend the 121 meeting scheduled by the reviewer of your onboarding task to say hello and discuss your progress. If your devstack gives you trouble, be sure to ask your reviewer or the Mattermost #devstack channel for help, and/or arrange a synchronous meeting to work through any issues.
  • Week 1: You've likely finished the onboarding course and your devstack setup, and are ready to work on a newcomer-friendly or other small task. Reach out to your mentor or the sprint firefighter to help find tasks, and a reviewer from the core team to help you. To avoid spillover, we recommend against pulling new tasks into the current sprint at first -- review cycles often take more time than expected. Instead, especially if a new sprint is starting soon, commit to a task in the next sprint and work ahead.
  • Week 2: At the end of this week, your mentor and 2 other core team members will complete a screening review of your work so far. This review exists to provide early feedback, and to identify extreme issues like a failure to communicate within 48h of pings on tickets and Mattermost, or cases where excessive time has been logged to tasks without sufficient explanation or outcomes. In such cases, we would give notice that the trial period will end. But if you're communicating on your tasks and making progress, your trial will continue as scheduled. Your mentor will pass on any feedback -- positive and negative -- from this review.
  • Week 3: By the end of this week, you should have completed some tasks, with story points totalling around 8-12. If you haven't, bring this up as soon as possible with your mentor. If you've had spillover, consider what went wrong during these tasks and talk about it with your mentor. Take care not to overcommit during the next sprints to get this under control. Time management is one of the hardest parts, so after each sprint ends, make sure the Sprint Commitments spreadsheet (linked from each cell's weekly sprint meeting) is accurate, and that your spillover is improving as you progress through the trial period.
  • Week 4: By this time, depending on when you started, you've completed 2-3 sprints, so it's time to ensure that you're completing a breadth of tasks to showcase your skills. Have you taken on increasingly difficult tasks? Have you submitted a PR to the Open edX platform? Have you launched appservers or contributed to Ocim? Have you completed any devops tasks? Have you been the primary reviewer on some tasks? If not, try to find tasks for the next sprints which would fill these gaps, and discuss any cell-specific expectations with your mentor.
  • Week 7: This week will be your developer review. All the core team members in your cell (plus one developer from each other cell) will review your tasks, PRs, and communications, and vote on whether to accept you into the core team, extend your trial period, or end your trial. All reviewers have to agree to confirm a new core member. We each do our own evaluation independently, and then discuss if there's a difference of opinion.
  • Week 8: This marks the end of your initial trial period -- Xavier will meet with you to discuss the results of the developer review. If you're joining the core team now, congratulations! There will be a small core team onboarding task to complete in your next sprint, and you can continue logging "onboarding" time to your onboarding ticket for a while. If your trial period has been extended, that's great too! Xavier will provide specific details on the improvements required, and it's really important to focus on these areas during your extension.
  • Week 11: If your trial period was extended, the core team will do another developer review, focusing on your improvements during the last 2 sprints.
  • Week 12: This week marks the end of your extended trial period, if applicable. Xavier will let you know the results of the second developer review.

Evaluation criteria#

The screening and developer reviews will be evaluated on the following criteria:

  • Technical skills.
    Team members must demonstrate development and devops abilities on basic and complex tasks.
  • Time management and spillovers.
    Newcomers must have at least half of their sprints clean during their initial trial (2/4), or two thirds of their sprints clean for extended trials (rounded down, e.g. 5/8). Confirmed core team members are expected to have at least 75% of their sprints clean. Sprint status is documented on the Sprint Commitments spreadsheet (linked from each cell's weekly sprint meeting).
  • Communication.
    See Roles: Communication for the expected response times, and the additional expectations for Newcomers.
  • Adaptability.
    Team members should respond gracefully to changes in task requirements and scope, communicate concerns and issues, and allocate effort appropriately across the current or follow-up tasks.
  • Potential for growth.
    Team members should demonstrate an enthusiasm for learning and improvement across all aspects of their work.
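The clean-sprint thresholds above boil down to taking the floor of a fraction of the sprint count. A minimal illustration (the `min_clean_sprints` helper is hypothetical, not part of any OpenCraft tooling):

```python
import math

def min_clean_sprints(total_sprints: int, ratio: float) -> int:
    """Minimum number of clean sprints required, rounding down."""
    return math.floor(total_sprints * ratio)

# Initial trial: half of 4 sprints must be clean
print(min_clean_sprints(4, 1 / 2))   # 2
# Extended trial: two thirds of 8 sprints, rounded down
print(min_clean_sprints(8, 2 / 3))   # 5
# Confirmed core team members: 75% of sprints clean
print(min_clean_sprints(12, 3 / 4))  # 9
```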

Here is some more detail about things the core team look for when evaluating newcomers:

  • Delivering On-Time: Avoiding spillover and delivering on schedule is really important in an environment where we make direct promises to clients about deliverables. Our reputation as an organization is on the line when we cannot deliver as we promised, so it matters tremendously to us to see a newcomer making deadlines consistently. It's required that you communicate explicitly, as soon as you can detect it, when you feel there is going to be spillover, and try to find someone else who can complete the task or help you complete it. It's totally ok to do this, and even welcomed by people who have time left in their sprint. We are a team, and we work together to avoid spillover.
  • Communication: As stressed above, as an international remote team, there is little progress we can make if we don't constantly communicate (while being mindful not to interrupt when it isn't urgent). We promise you that we didn't recruit any mind readers! We won't magically figure anything out unless it's been talked about, through any of our multiple modes of communication. You should be communicating with your reviewers at least every 1-2 days on your progress on their task (by commenting on the JIRA tickets). Even if they have no questions, just stating status is important and can give reviewers/mentors somewhere to jump in and help. On the other hand, when blocked on a task, make sure to reach out to the reviewer for help. If the reviewer isn't available, you can reach out to the sprint firefighters.
  • Show your skills: It's important to take tasks of progressive difficulty, and to take on reviews too. It's much easier for the core team to review your trial if you have picked varied tasks of different complexity and skillsets. We're looking for a cross-section of tasks across all our required work areas: full stack dev, devops, and ops.
  • "Nice": This point is in quotes because everyone obviously likes being around nice people, so you'd assume it was obvious. But while everyone believes "Yeah, I'm nice!", it goes a long way to be deliberately nice with your colleagues, and not just to believe you are; they will simply enjoy working with you more.

Screening Review#

For the first complete sprint the newcomer is at OpenCraft, their mentor will schedule a screening review task, assigned to themselves with at least two other core members as reviewers. They'll evaluate the newcomer's work in their first complete sprint and decide if the trial should go ahead.

End of trial, extensions and developer review schedules#

When a newcomer first joins OpenCraft, we set a date for the end of the trial and a cutoff date for developer reviews. The end of the trial is calculated from the date the newcomer started working at OpenCraft, based on the current practice of a four-sprint trial period. This means the end of trial date is exactly 56 days after the starting date.

end_of_trial = start_date + 56

To make sure the developer reviews are completed in time for a fair discussion, they must be completed at least seven days before the end of the trial, i.e. 7 weeks (49 days) after the start:

review_deadline = end_of_trial - 7 = start_date + 49
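The date arithmetic above can be sketched with Python's `datetime` module (the `trial_dates` function and the sample start date are illustrative, not part of our tooling):

```python
from datetime import date, timedelta

TRIAL_DAYS = 56          # four 2-week sprints
REVIEW_MARGIN_DAYS = 7   # reviews are due a week before the trial ends

def trial_dates(start_date: date) -> tuple:
    """Return (end_of_trial, review_deadline) for a newcomer."""
    end_of_trial = start_date + timedelta(days=TRIAL_DAYS)
    review_deadline = end_of_trial - timedelta(days=REVIEW_MARGIN_DAYS)
    return end_of_trial, review_deadline

# Example: a newcomer starting on 2021-03-01
end, deadline = trial_dates(date(2021, 3, 1))
print(end)       # 2021-04-26
print(deadline)  # 2021-04-19
```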

If the core team decides to accept the newcomer or to end the trial, the process is complete.

The core team can also choose to extend the trial period for two or four sprints, starting a new process similar to the original end of trial and developer reviews.

Similar to the original end-of-trial developer review, there will be an end-of-extension developer review. Depending on the duration of the extension, the end date of the extension will be 28 or 56 days from the date of the 121 at which the newcomer received the feedback and was notified of the extension:

end_of_extension = date_of_121 + 28
# or
end_of_extension = date_of_121 + 56

Again, the developer review deadline must be seven days before the end of the extension.

review_deadline = end_of_extension - 7
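The extension arithmetic can be sketched the same way (again, the `extension_dates` helper and the sample 121 date are illustrative; a sprint is assumed to last 14 days):

```python
from datetime import date, timedelta

def extension_dates(date_of_121: date, sprints: int) -> tuple:
    """Return (end_of_extension, review_deadline) for a 2- or 4-sprint
    extension (28 or 56 days from the 121 meeting)."""
    assert sprints in (2, 4), "extensions last two or four sprints"
    end_of_extension = date_of_121 + timedelta(days=sprints * 14)
    review_deadline = end_of_extension - timedelta(days=7)
    return end_of_extension, review_deadline

# Example: a 2-sprint extension announced at a 121 on 2021-05-03
end, deadline = extension_dates(date(2021, 5, 3), sprints=2)
print(end)       # 2021-05-31
print(deadline)  # 2021-05-24
```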

Special attention must be paid to the end of the trial and the end of the extension when the newcomer didn't start working at OpenCraft at the beginning of a sprint. In such cases, the review tasks may have to be scheduled a sprint earlier than expected to give enough time for any discussions.

When the newcomer joined at the beginning of a sprint, the developer review tasks must be completed in the first week of the last sprint of the trial/extension, leaving the second week for discussions.

Other references#

See also: