Campaign group Global Witness said Facebook failed to prevent discriminatory targeting of ads and that its algorithm was biased in choosing who would see them.
In an experiment, almost all Facebook users shown adverts for mechanics were men, while ads for nursery nurses were seen almost exclusively by women.
Facebook says its system shows people ads they may be most interested in.
Global Witness submitted two job ads for approval, asking Facebook not to show:
* one to women
* the other to anyone over the age of 55
And the social-media giant approved both ads for publication, although it did ask the organisation to tick a box saying it would not discriminate against these groups.
Global Witness pulled the adverts before they were published.
Facebook said: "Our system takes into account different kinds of information to try and serve people ads they will be most interested in and we are reviewing the findings within this report."
In 2019, the US Department of Housing and Urban Development brought a legal case over housing adverts on Facebook it alleged discriminated on the basis of ethnicity.
The social network has since agreed not to allow discriminatory ads of this kind in the United States and Canada.
And it says it is exploring extending the limits on the targeting of job, housing and credit ads to other countries.
"The fact that it is possible to do this on Facebook in the UK is particularly shocking," Naomi Hirst, who led Global Witness's investigation, said.
But the campaign group is even more concerned by what it found out about how Facebook's system handled ads for which the recruiter did not specify a target audience.
Global Witness created four job ads, linked to real vacancies on the indeed.com platform, for nursery nurses, pilots, mechanics and psychologists.
The group specified only that the ads should be seen by UK adults.
"That meant that it was entirely up to Facebook's algorithm to decide who to show the ads to," Ms Hirst said, "and what it decided appears to us to be downright sexist."
Of the people shown an ad for:
* mechanics, 96% were men
* nursery nurses, 95% were women
* airline pilots, 75% were men
* psychologists, 77% were women
The algorithm is designed to ensure as many people as possible click on the ads - but Global Witness says it is perpetuating and even amplifying biases already built into recruitment.
Previously, for example, jobs for mechanics may have been advertised in magazines aimed at men.
"The difference here," Ms Hirst said, "is that if you are a woman looking for a job as a mechanic, you could just as easily go to a shop and buy that magazine as your male peer.
"It's just simply not true online."
Global Witness asked barrister Schona Jolly QC to examine its evidence.
And in a submission to the UK Equality and Human Rights Commission, she wrote: "Facebook's system itself may, and does appear to, lead to discriminatory outcomes."
Global Witness has also contacted the information commissioner about what it describes as the discriminatory practices resulting from the way Facebook processes data for job adverts.
Ravi Naik, a data-rights lawyer acting for Global Witness, said its concern was that Facebook's advertising mechanisms might lead to the social network's customers breaching equality laws.
"That is massively consequential because Facebook's entire business model is advertising and if that business model results in discriminatory practices, that undermines the ability of Facebook to operate properly in this country," he added.