April 2019 Jobs Report and Industry Update


Economics & Job Creation:


Life Sciences:
“Kids store 1.5 megabytes of information to master their native language”

“‘Smart’ pajamas could monitor and help improve sleep”

“Drug takes aim at cancer metastasis”

The Industrials:
“How measurable is online advertising?”

Human Capital Solutions, Inc. (HCS) www.humancs.com is a Retained Executive Search and Professional Recruiting firm focused on Healthcare, Life Sciences, the Industrials, and Technology. Visit our LinkedIn Company Page to learn more about HCS and receive weekly updates.

HCS has created the Prosperity at Work proposition which focuses on creating prosperous relationships between companies and their employees (associates). HCS assists companies in improving bottom line profitability by efficiently planning, organizing and implementing optimized, practical and value-added business solutions.





Economics & Job Creation:


Total nonfarm payroll employment increased by 196,000 in March, and the
unemployment rate was unchanged at 3.8 percent, the U.S. Bureau of Labor
Statistics reported today. Notable job gains occurred in health care and
in professional and technical services.

This news release presents statistics from two monthly surveys. The household
survey measures labor force status, including unemployment, by demographic
characteristics. The establishment survey measures nonfarm employment, hours,
and earnings by industry. For more information about the concepts and
statistical methodology used in these two surveys, see the Technical Note.

Household Survey Data

The unemployment rate remained at 3.8 percent in March, and the number of
unemployed persons was essentially unchanged at 6.2 million. (See table A-1.)

Among the major worker groups, the unemployment rates for adult men
(3.6 percent), adult women (3.3 percent), teenagers (12.8 percent), Whites
(3.4 percent), Blacks (6.7 percent), Asians (3.1 percent), and Hispanics
(4.7 percent) showed little or no change in March. (See tables A-1, A-2,
and A-3.)

In March, the number of long-term unemployed (those jobless for 27 weeks
or more) was essentially unchanged at 1.3 million and accounted for 21.1
percent of the unemployed. (See table A-12.)

The labor force participation rate, at 63.0 percent, was little changed
over the month and has shown little movement on net over the past 12 months.
The employment-population ratio was 60.6 percent in March and has been
either 60.6 percent or 60.7 percent since October 2018. (See table A-1.)

The number of persons employed part time for economic reasons (sometimes
referred to as involuntary part-time workers) was little changed at 4.5
million in March. These individuals, who would have preferred full-time
employment, were working part time because their hours had been reduced
or they were unable to find full-time jobs. (See table A-8.)

In March, 1.4 million persons were marginally attached to the labor force,
little different from a year earlier. (Data are not seasonally adjusted.)
These individuals were not in the labor force, wanted and were available
for work, and had looked for a job sometime in the prior 12 months. They
were not counted as unemployed because they had not searched for work in
the 4 weeks preceding the survey. (See table A-16.)

Among the marginally attached, there were 412,000 discouraged workers in
March, about unchanged from a year earlier. (Data are not seasonally
adjusted.) Discouraged workers are persons not currently looking for work
because they believe no jobs are available for them. The remaining 944,000
persons marginally attached to the labor force in March had not searched
for work for reasons such as school attendance or family responsibilities.
(See table A-16.)

Establishment Survey Data

Total nonfarm payroll employment increased by 196,000 in March, with notable
gains in health care and in professional and technical services. Employment
growth averaged 180,000 per month in the first quarter of 2019, compared
with 223,000 per month in 2018. (See table B-1.)

Health care added 49,000 jobs in March and 398,000 over the past 12 months.
Over the month, employment increased in ambulatory health care services (+27,000),
hospitals (+14,000), and nursing and residential care facilities (+9,000).

Employment in professional and technical services grew by 34,000 in March
and 311,000 over the past 12 months. In March, computer systems design and
related services added 12,000 jobs. Employment continued to trend up in
architectural and engineering services (+6,000) and in management and technical
consulting services (+6,000).

In March, employment in food services and drinking places continued its
upward trend (+27,000), in line with its average monthly gain over the prior
12 months.

Employment in construction showed little change in March (+16,000) but has
increased by 246,000 over the past 12 months.

Manufacturing employment changed little for the second month in a row (-6,000
in March, following +1,000 in February). In the 12 months prior to February,
manufacturing had added an average of 22,000 jobs per month. Within the
industry, employment in motor vehicles and parts declined in March (-6,000).

Employment in other major industries, including mining, wholesale trade,
retail trade, transportation and warehousing, information, financial
activities, and government, showed little change over the month.

The average workweek for all employees on private nonfarm payrolls increased
by 0.1 hour to 34.5 hours in March, offsetting a decline of 0.1 hour in
February. In manufacturing, the average workweek was unchanged in March at
40.7 hours, while overtime decreased by 0.1 hour to 3.4 hours. The average
workweek for production and nonsupervisory employees on private nonfarm
payrolls rose by 0.1 hour to 33.7 hours. (See tables B-2 and B-7.)

In March, average hourly earnings for all employees on private nonfarm
payrolls rose by 4 cents to $27.70, following a 10-cent gain in February.
Over the past 12 months, average hourly earnings have increased by 3.2 percent.
Average hourly earnings of private-sector production and nonsupervisory
employees increased by 6 cents to $23.24 in March. (See tables B-3 and B-8.)

The change in total nonfarm payroll employment for January was revised up from
+311,000 to +312,000, and the change for February was revised up from +20,000
to +33,000. With these revisions, employment gains in January and February
combined were 14,000 more than previously reported. (Monthly revisions result
from additional reports received from businesses and government agencies since
the last published estimates and from the recalculation of seasonal factors.)
After revisions, job gains have averaged 180,000 per month over the last 3 months.





Life Sciences:

“Kids store 1.5 megabytes of information to master their native language”

Learning one’s native language may seem effortless. One minute, we’re babbling babies. The next, we’re in school reciting Martin Luther King Jr.’s “I Have a Dream” speech or Robert Frost’s poem “Fire and Ice.”

But new research from the University of California, Berkeley, suggests that language acquisition between birth and 18 is a remarkable feat of cognition, rather than something humans are just hardwired to do.

Researchers calculated that, from infancy to young adulthood, learners absorb approximately 12.5 million bits of information about language — about two bits per minute — to fully acquire linguistic knowledge. If converted into binary code, the data would fill a 1.5 MB floppy disk, the study found.

The findings, published today in the Royal Society Open Science journal, challenge assumptions that human language acquisition happens effortlessly, and that robots would have an easy time mastering it.

“Ours is the first study to put a number on the amount you have to learn to acquire language,” said study senior author Steven Piantadosi, an assistant professor of psychology at UC Berkeley. “It highlights that children and teens are remarkable learners, absorbing upwards of 1,000 bits of information each day.”

For example, when presented with the word “turkey,” a young learner typically gathers bits of information by asking, “Is a turkey a bird? Yes, or no? Does a turkey fly? Yes, or no?” and so on, until grasping the full meaning of the word “turkey.”

A bit, or binary digit, is a basic unit of data in computing, and computers store information and calculate using only zeroes and ones. The study uses the standard definition of eight bits to a byte.
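As a rough check of the arithmetic (a sketch that simply restates the study’s own figures), 12.5 million bits at eight bits per byte works out to about 1.5 megabytes, and spread over 18 years it implies well over 1,000 bits per day:

```python
# Back-of-the-envelope check of the study's figures (illustrative only).
TOTAL_BITS = 12.5e6   # bits absorbed from infancy to age 18, per the study
BITS_PER_BYTE = 8     # standard definition used by the study

megabytes = TOTAL_BITS / BITS_PER_BYTE / 1e6
print(f"{megabytes:.2f} MB")           # ~1.56 MB, roughly a 1.5 MB floppy disk

days = 18 * 365       # learning window in days
bits_per_day = TOTAL_BITS / days
print(f"{bits_per_day:.0f} bits/day")  # ~1900, i.e. "upwards of 1,000 bits" daily
```

The numbers line up with the quoted claims: about 1.5 MB in total, and roughly two bits per waking minute.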

“When you think about a child having to remember millions of zeroes and ones (in language), that says they must have really pretty impressive learning mechanisms,” Piantadosi said.

Piantadosi and study lead author Frank Mollica, a Ph.D. candidate in cognitive science at the University of Rochester, sought to gauge the amounts and different kinds of information that English speakers need to learn their native language.

They arrived at their results by running various calculations about language semantics and syntax through computational models. Notably, the study found that linguistic knowledge focuses mostly on the meaning of words, as opposed to the grammar of language.

“A lot of research on language learning focuses on syntax, like word order,” Piantadosi said. “But our study shows that syntax represents just a tiny piece of language learning, and that the main difficulty has got to be in learning what so many words mean.”

That focus on semantics versus syntax distinguishes humans from robots, including voice-controlled digital helpers such as Alexa, Siri and Google Assistant.

“This really highlights a difference between machine learners and human learners,” Piantadosi said. “Machines know what words go together and where they go in sentences, but know very little about the meaning of words.”

As for the question of whether bilingual people must store twice as many bits of information, Piantadosi said this is unlikely in the case of word meanings, many of which are shared across languages.

“The meanings of many common nouns like ‘mother’ will be similar across languages, and so you won’t need to learn all of the bits of information about their meanings twice,” he said.






“‘Smart’ pajamas could monitor and help improve sleep”

If you’ve ever dreamed about getting a good night’s sleep, your answer may someday lie in data generated by your sleepwear. Researchers have developed pajamas embedded with self-powered sensors that provide unobtrusive and continuous monitoring of heartbeat, breathing and sleep posture — all factors that play a role in how well a person slumbers. The “smart” garments could give ordinary people, as well as clinicians, useful information to help improve sleep patterns.

The researchers will present their results today at the American Chemical Society (ACS) Spring 2019 National Meeting & Exposition.

“Our smart pajamas overcame numerous technical challenges,” says Trisha L. Andrew, Ph.D., who led the team. “We had to inconspicuously integrate sensing elements and portable power sources into everyday garments, while maintaining the weight, feel, comfort, function and ruggedness of familiar clothes and fabrics. We also worked with computer scientists and electrical engineers to process the myriad signals coming from the sensors so that we had clear and easy-to-understand information.”

Getting enough quality sleep can help protect people against stress, infections and multiple diseases, such as heart and kidney disease, high blood pressure and diabetes, according to the National Institutes of Health. Studies have found that quality sleep also increases mental acuity and sharpens decision-making skills. Yet most people do not get enough sleep — or the right kind.

The National Sleep Foundation estimates that the sleep industry is booming, taking in nearly $29 billion in 2017. Although some manufacturers of smart mattresses claim the products can sense movement and infer sleep posture, they do not provide detailed information to the sleeper and are not portable for travel. Commercially available electronic bands worn on the wrist give information about heart rate and monitor how much total sleep the wearer gets. But until now, there has not been anything that a typical consumer could use to monitor posture and respiratory and cardiac signals when slumbering.

The key to the smart pajamas is a process called reactive vapor deposition. “This method allows us to synthesize a polymer and simultaneously deposit it directly on the fabric in the vapor phase to form various electronic components and, ultimately, integrated sensors,” Andrew says. “Unlike most electronic wearables, the vapor-deposited electronic polymer films are wash-and-wear stable, and they withstand mechanically demanding textile manufacturing routines.”

The “Phyjama,” as the University of Massachusetts, Amherst team calls it, has five discrete textile patches with sensors in them. The patches are interconnected using silver-plated nylon threads shielded in cotton. The wires from each patch end up at a button-sized printed circuit board placed at the same location as a pajama button. Data are wirelessly sent to a receiver using a small Bluetooth transmitter that is part of the circuitry in the button.

The garment includes two types of self-powered sensors that detect “ballistic movements,” or pressure changes. Four of the patches are piezoelectric. They detect constant pressures, such as that of a bed against a person’s body. These first-of-their-kind patches are used in different parts of the Phyjama so that the researchers can determine sleeping posture. However, this type of sensor cannot pick up the faint pressure from a beating heart. The triboelectric patch detects quick changes in pressure, such as the physical pumping of the heart, which provides information on heart rate. This is the first time such a sensor has been shown to detect tiny ballistic signals from the heart.

Andrew’s team has tested the garment on volunteers and validated the readings from the sensors independently. They also have applied for patents on the Phyjama. After Andrew partners with a manufacturer, she estimates the product could be on the market within two years for $100-$200.

Currently, the team is working on extending the technology to wearable electronic sensors that detect gait and send feedback to a monitor to help prevent falls. This application could find use in settings such as nursing homes and retirement centers, Andrew says.






“Drug takes aim at cancer metastasis”

Many cancers are relatively harmless at their site of origin, and it is only when they metastasize to sites like the brain, bones, lungs, and liver that they become especially dangerous. And so, in addition to stopping the growth of cancer at its primary site, an ongoing goal of cancer research is to keep cancer contained — to stop its ability to travel through the body. A University of Colorado Cancer Center study presented at the American Association for Cancer Research (AACR) Annual Meeting 2019 offers another step in an ongoing line of research aimed at exactly that.

Over the course of about a decade, the lab of Heide Ford, PhD, CU Cancer Center Associate Director and the David F. and Margaret Turley Grohne Chair in Basic Cancer Research, has shown that a “transcriptional complex” called SIX1/EYA can gift cells, and even nearby cells, with the ability to metastasize. Now the lab, in partnership with the lab of Rui Zhao, PhD, associate professor in Biochemistry and Molecular Genetics at CU Anschutz Medical Campus, and with the National Institutes of Health, has identified a compound that inhibits this action. When the group administered this yet-to-be-named compound to mouse models of breast cancer, they found that it could “dramatically suppress breast cancer associated metastasis,” the study reports.

“A few years ago, we did a small-molecule screen,” Ford says. “Rui [Zhao] set up a high-throughput screen to identify compounds that would disrupt SIX1/EYA, and Juan Marugan and his team at the National Chemical Genomics Center miniaturized the screen and then used their compound libraries to perform a large-scale screen. We got a bunch of compounds and have been working to improve them ever since. Our lead compound is looking great — we don’t quite understand the mechanism of action yet, but in preliminary experiments it dramatically affects metastasis.”

Like many mechanisms in cancer, one factor making the story of SIX1/EYA and metastasis especially complex is that these are far from the only players. First, the Six1 gene itself is involved in the early development of many of the body’s tissues, including muscle, auditory, kidney, and craniofacial structures. But after early development, this gene goes quiet in most adult tissues — unless it is accidentally paired with EYA after development is complete, which can restart Six1’s action out of context.

The resulting SIX1/EYA pairing is a “transcriptional complex” that can regulate how often other genes are read and manufactured, effectively turning up and down gene expression. In the context of cancer, work in the Ford lab and elsewhere shows that SIX1/EYA is like a volume knob that magnifies signals transmitted through a network called TGFb. Cells on the receiving end of this TGFb signal go through a rather dramatic transformation, called an epithelial-to-mesenchymal transition, or EMT.

Epithelial cells can’t travel through the body. They must remain anchored to the tissues where they grow; if they become unanchored they die through a process called anoikis, or “the state of being without a home.” But cells with mesenchymal properties shrug off anoikis and thus can travel through the body more easily. So, the chain of cause-and-effect goes something like this: EYA interacts with SIX1; together SIX1/EYA turns on TGFb signaling (along with additional signaling pathways that promote migration and invasion), which induces cells and even nearby cells to undergo EMT, making these EMT cells suddenly able to travel. The result is that cancer cells that should be stuck in place become able to metastasize.

Ford’s drug stops this chain reaction at the first step, silencing SIX1/EYA.

“What we think our drug is doing in the tumors is it might be somehow reversing the EMT, making these cells unable to metastasize,” Ford says. “In fact, we didn’t have enough drug in this study and so we had to stop administering it in our animal models after only three weeks, but we measured metastasis out to nine weeks and it remained almost absent, implying that there is some sort of long-lasting effect that we wouldn’t have predicted.”

Because SIX1 has no role in most adult tissues, inhibiting its action should have few side-effects.

“We’ve done toxicity tests in collaboration with Dan Gustafson from Colorado State University, and a dose that was almost twice what we administered in our study still had no toxicity we could measure,” Ford says, noting that lack of toxicity means that in addition to exploring this compound as a single-agent therapy against cancer metastasis, it may be possible to combine SIX1/EYA inhibition with other therapies, without increasing toxicity.

In addition to grants from organizations including the National Institutes of Health and National Cancer Institute, Ford’s lab recently received a grant from SPARK Colorado, a program meant to speed the translation of promising basic science into clinical application. Ford hopes to use the grant monies to “hire chemists to make the drug more soluble, more stable, and more potent.”

Ford says that, intellectually, she and Dr. Zhao would like to know more about exactly how the drug works, for example whether the compound binds to SIX1 or to EYA to inhibit the complex’s action. But she says, “If it works, people often don’t care how it works — we may not need to know the mechanism of action in order to keep moving forward. What we know from our initial tests is that our drug inhibited metastasis substantially, and so we hope it could help people not get new metastases.”





The Industrials:

“How measurable is online advertising?”

Researchers from Northwestern University and Facebook in March published new research in the INFORMS journal Marketing Science that sheds light on whether common approaches for online advertising measurement are as reliable and accurate as the “gold standard” of large-scale, randomized experiments.

The study, titled “A Comparison of Approaches to Advertising Measurement: Evidence from Big Field Experiments at Facebook,” is authored by Brett Gordon of Northwestern University; Florian Zettelmeyer of Northwestern University and the National Bureau of Economic Research; and Neha Bhargava and Dan Chapsky of Facebook.

“Our findings suggest that commonly used observational approaches that rely on data usually available to advertisers often fail to accurately measure the true effect of advertising,” said Brett Gordon.

Observational approaches encompass a broad class of statistical models that rely on the data “as they are,” generated without the explicit manipulation of a randomized experiment.

“We found a significant difference in the ad effectiveness obtained from randomized control trials and those observational methods that are frequently used by advertisers to evaluate their campaigns,” added Zettelmeyer. “Generally, the current and more common methods overestimate ad effectiveness relative to what we found in our randomized tests. Though in some cases, they significantly underestimate effectiveness.”

Measuring the effectiveness of advertising remains an important problem for many firms. A key question is whether an advertising campaign produced incremental outcomes: did more consumers purchase because they saw an ad, or would many of those consumers have purchased even in the absence of the ad? Obtaining an accurate measure of incremental outcomes (“conversions”) helps an advertiser calculate the return on investment (ROI) of the campaign.
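The ROI calculation described above can be sketched in a few lines (a minimal sketch with invented numbers; none of these figures come from the study):

```python
# Hypothetical illustration of incremental-lift ROI; all inputs are invented.
def campaign_roi(conv_rate_exposed, conv_rate_baseline, n_exposed,
                 value_per_conversion, ad_spend):
    """ROI computed from *incremental* conversions, not total conversions.

    conv_rate_baseline is the rate at which comparable users would have
    purchased anyway, without seeing the ad.
    """
    incremental = (conv_rate_exposed - conv_rate_baseline) * n_exposed
    return (incremental * value_per_conversion - ad_spend) / ad_spend

# 2.0% of 1M exposed users convert vs. a 1.5% baseline: 5,000 incremental
# conversions worth $40 each against $100k of spend.
roi = campaign_roi(0.020, 0.015, n_exposed=1_000_000,
                   value_per_conversion=40.0, ad_spend=100_000.0)
print(f"{roi:.0%}")  # 100%: the campaign returned $2 for every $1 spent
```

The whole measurement problem is estimating `conv_rate_baseline` correctly: crediting the campaign with *total* rather than incremental conversions would overstate ROI dramatically.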

“Digital platforms that carry advertising, such as Facebook, have created comprehensive means to assess ad effectiveness, using granular data that link ad exposures, clicks, page visits, online purchases and even offline purchases,” said Gordon. “Still, even with these data, measuring the causal effect of advertising requires the proper experimentation platform.”

The study authors used data from 15 U.S. advertising experiments at Facebook comprising 500 million user-experiment observations and 1.6 billion ad impressions.

Facebook’s “conversion lift” experimentation platform provides advertisers with the ability to run randomized controlled experiments to measure the causal effect of an ad campaign on consumer outcomes.

These experiments randomly allocate users to a control group, who are never exposed to the ad, and to a test group, who are eligible to see the ad. Comparing outcomes between the groups provides the causal effect of the ad because randomization ensures the two groups are, on average, equivalent except for advertising exposures in the test group. The experimental results from each ad campaign served as a baseline with which to evaluate common observational methods.

Observational methods compare outcomes for users who were exposed to the ad with outcomes for users who were not. These two groups tend to differ systematically in many ways, such as age and gender. Some of these differences are observable, because the advertiser (or its advertising platform) often has access to data on these and other characteristics; in addition to knowing the gender and age of an online user, it is possible to observe the type of device being used, the user’s location, how long it has been since the user last visited, and so on. The tricky part is that the exposed and unexposed groups may also differ in ways that are very difficult to measure, such as a user’s underlying affinity for the brand. To claim that the ad “caused” an effect, the researcher must account for both the observed and the unobserved differences between the two groups. Observational methods use data on the observed characteristics of users in an attempt to adjust for both.
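This confounding problem can be made concrete with a toy simulation (every number below is invented for illustration): an unobserved “brand affinity” makes users both more likely to be shown the ad and more likely to buy anyway, so the naive exposed-vs-unexposed comparison overstates the ad’s true lift, while random assignment recovers it.

```python
import random

random.seed(42)

TRUE_AD_LIFT = 0.01  # the ad truly adds 1 point of purchase probability

def draw_user():
    """Unobserved brand affinity: affine users buy more, ad or no ad."""
    affinity = random.random() < 0.3
    base_rate = 0.08 if affinity else 0.02
    return affinity, base_rate

def observational_estimate(n=100_000):
    """Naive exposed-vs-unexposed gap; targeting ties exposure to affinity."""
    exp_buy = exp_n = unexp_buy = unexp_n = 0
    for _ in range(n):
        affinity, base_rate = draw_user()
        exposed = random.random() < (0.7 if affinity else 0.3)  # targeting
        bought = random.random() < base_rate + (TRUE_AD_LIFT if exposed else 0.0)
        if exposed:
            exp_n += 1
            exp_buy += bought
        else:
            unexp_n += 1
            unexp_buy += bought
    return exp_buy / exp_n - unexp_buy / unexp_n

def randomized_estimate(n=100_000):
    """Random assignment breaks the affinity/exposure link."""
    test_buy = test_n = ctrl_buy = ctrl_n = 0
    for _ in range(n):
        _, base_rate = draw_user()
        in_test = random.random() < 0.5
        bought = random.random() < base_rate + (TRUE_AD_LIFT if in_test else 0.0)
        if in_test:
            test_n += 1
            test_buy += bought
        else:
            ctrl_n += 1
            ctrl_buy += bought
    return test_buy / test_n - ctrl_buy / ctrl_n

print(f"observational estimate: {observational_estimate():.4f}")  # well above 0.01
print(f"randomized estimate:    {randomized_estimate():.4f}")     # close to 0.01
```

In this setup the observational gap lands near three times the true lift, because the exposed group is disproportionately made up of high-affinity buyers; the randomized comparison stays close to the 1-point effect that was built into the simulation.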

“We set out to determine whether, as commonly believed, current observational methods using comprehensive individual-level data are ‘good enough’ for ad measurement,” said Zettelmeyer. “What we found was that even fairly comprehensive data prove inadequate to yield reliable estimates of advertising effects.”

“In principle, we believe that using large-scale randomized controlled trials to evaluate advertising effectiveness should be the preferred method for advertisers whenever possible.”


