#discoveries

waynerad@diasp.org

Plastics. We think so much about the exponential growth of computer technology that we forget about other things growing exponentially.

It's like that line from the 1967 movie "The Graduate": "I just want to say one word to you... just one word: Plastics."

We may be paying more attention to computers, but plastics haven't gone away. They've kept growing exponentially.

I'm going to quote a boatload of stuff from this PlastChem report. It's going to look like a lot, but it's really only about 2 pages out of an 87-page report (181 pages once you count all the appendices). And I'm going to quote these parts rather than summarize because I can't think of a way to compress them down much further, so rather than do a worse job of it myself I'm just going to present the choice quotes from the report. Here we go:

"The plastics economy is one of the largest worldwide. The global plastic market was valued at 593 billion USD in 2021. In the same year, the global trade value of plastic products was 1.2 trillion USD or 369 million metric tons, China, the USA, and European states are the major plastic-producing countries with emerging economies experiencing a rapid expansion of local production capacities. The plastics economy is tightly embedded in the petrochemical sector, consuming 90% of its outputs to make plastics. This, in turn, creates strong linkages with the fossil industry, as 99% of plastic is derived from fossil carbon, production mostly relies on fossil energy, and the plastic and fossil industries are economically and infrastructurally integrated."

"Plastic production increases exponentially. The global production of plastics has doubled from 234 million tons in 2000 to 460 million tons in 2019 and its demand grows faster that cement and steel. On average, production grew by 8.5% per year from 1950-2019. Business-as-usual scenarios project that plastic production will triple from 2019 to 2060 with a growth rate of 2.5-4.6% per year, reaching 1230 million tons in 2060. By 2060, 40 billion tons of plastic will have been produced, with about 6 billion tons currently present on Earth."

"The projected increase in plastic use is driven by economic growth and digitalization across regions and sectors. China is expected to remain the largest plastic user, but plastic demand is expected to grow stronger in fast-growing regions, such as Sub-Saharan Africa, India, and other Asian countries. Plastic use is projected to increase substantially across all sectors until 2060, and polymer types used in applications for packaging, construction and transportation make up the largest share of the projected growth.14 Importantly, the OECD predicts that petroleumbased, non-recycled plastics will continue to dominate the market in 2060. Single-use plastics, currently 35-40% of global production, are expected to grow despite regional phase-outs."

"Globally, seven commodity polymers dominate the plastics market. These include polypropylene (PP, 19% of global production), low-density polyethylene (LDPE, 14%), polyvinylchloride (PVC, 13%), high-density polyethylene (HDPE, 13%), polyethylene terephthalate (PET, 6%), polyurethane (PUR, 6%), and polystyrene (PS, 5%). Over 80% of Europe's total polymer demand is met by these, mostly in virgin form). Their usage varies by sector, with HDPE, LDPE, PET, and PP mainly being applied for packaging, and PS and PVC in construction."

"Plastic waste generation is expected to almost triple by 2060. In line with the growth in plastic use, the future plastic waste generation is projected to almost triple, reaching 1014 million tons in 2060. Waste generated from short-lived applications, including packaging, consumer products and textiles, and plastic used in construction are expected to dominate. The latter is relevant because long-lived applications will continue to produce 'locked-in' plastic waste well into the next century. Despite some improvements in waste management and recycling, the OECD projects that the amount of mismanaged plastic waste will continue to grow substantially and almost double to 153 million tons by 2060."

"The scale of plastic pollution is immense. The OECD estimates that 22 million tons of plastic were emitted to the environment in 2019 alone.14 While there are uncertainties in these estimates, they illustrate the substantial leakage of plastics into nature. Accordingly, approximately 140 million tons of plastic have accumulated in aquatic ecosystems until 2019. Emissions to terrestrial systems amount to 13 million tons per year (2019), but the accumulating stocks remain unquantified due to data gaps. While mismanaged waste contributes 82% to these plastic emissions, substantial leakages originate further upstream and throughout the plastic life cycle, such as from the release of micro- and nanoplastics. While the latter represent a relatively small share in terms of tonnage, the number of these particles outsizes that of larger plastic items emitted to nature."

"Plastic pollution is projected to triple in 2060. A business-as-usual scenario with some improvement in waste management and recycling predicts that the annual plastic emissions will double to 44 million tons in 2060. This is in line with other projections which estimate annual emissions of 53-90 million tons by 2030 and 34-55 million tons by 2040 to aquatic environments. According to the OECD, the accumulated stocks of plastics in nature would more than triple in 2060 to an estimated amount of 493 million tons, including the marine environment (145 million tons, 5-fold increase) and freshwater ecosystems (348 million tons, 3-fold increase). Since the impacts of plastic pollution are diverse and occur across the life cycle of plastics, the OECD concludes that 'plastic leakage is a major environmental problem and is getting worse over time. The urgency with which policymakers and other societal decision makers must act is high.'"

"Plastic monomers (e.g., ethylene, propylene, styrene) are mainly derived from fossil resources and then reacted (or, polymerized) to produce polymers (e.g., polyethylene, polypropylene, and polystyrene) that form the backbone of a plastic material. A mixture of starting substances (i.e., monomers, catalysts, and processing aids) is typically used in polymerization reactions. To produce plastic materials, other chemicals, such as stabilizers, are then added. This creates the so-called bulk polymer, usually in the form of pre-production pellets or powders. The bulk polymer is then processed into plastic products by compounding and forming steps, like extrusion and blow molding. Again, other chemicals are added to achieve the desired properties of plastic products, in particular additives. Importantly, such additives were crucial to create marketable materials in the initial development of plastics, and a considerable scientific effort was needed to stabilize early plastics. Throughout this process, processing aids are used to facilitate the production of plastics."

"At the dawn of the plastic age, scientists were unaware of the toxicological and environmental impacts of using additives in plastics. Their work to make plastic durable is essentially what has made plastics both highly useful, but also persistent and toxic."

"The growth in additives production mirrors that of plastics. The amount of additives in plastics can significantly vary, ranging from 0.05-70% of the plastic weight. For example, antioxidants in PE, PS, and ABS (acrylonitrile butadiene styrene) account for 0.5-3% of their weight. Light/UV stabilizers in PE, PP, and PVC constitute 0.1-10% by weight. Flame retardants can make up 2-28% of the weight, while plasticizers in PVC can be as high as 70% by weight. About 6 million tons of additives have been produced in 2016 and the annual growth rate is 4% in the additives sector. Accordingly, additive production can be expected to increase by 130-280 thousand tons per year. By 2060, the joint production volume of a range of additive classes is the projected to increase by a factor of five, closely mirroring the growth in overall plastic production."

"Plastics also contain non-intentionally added substances. Non-intentionally added substances include impurities, degradation products, or compounds formed during the manufacturing process of plastics, which are not deliberately included in the material. Examples include degradation products of known additives (e.g., alkylphenols from antioxidants) and polymers (e.g., styrene oligomers derived from polystyrene). Unlike intentionally added substances (IAS), which are in principle known and therefore can be assessed and regulated, non-intentionally added substances are often complex and unpredictable. Thus, their identity remains mostly unknown and these compounds, though present in and released from all plastics, cannot easily be analyzed, assessed, and regulated. Despite these knowledge gaps, non-intentionally added substances probably represent a major fraction of plastic chemicals."

"The number and diversity of known plastic chemicals is immense. A recent analysis by the United Nations Environment Programme suggests that there are more than 13 000 known plastic chemicals, including polymers, starting substances, processing aids, additives, and non-intentionally added substances. The main reason for such chemical complexity of plastics is the highly fragmented nature of plastic value chains that market almost 100 000 plastic formulations and more than 30 000 additives, 16,000 pigments, and 8000 monomers. While this represents the number of commercially available constituents of plastics, not necessarily the number of unique plastic chemicals, it highlights that the diversity of the plastics sector creates substantial complexity in terms of plastic chemicals."

"A full overview of which chemicals are present in and released from plastics is missing, mostly due to a lack of transparency and publicly available data. Nonetheless, the available scientific evidence demonstrates that most plastic chemicals that have been studied are indeed released from plastic materials and products via migration into liquids and solids (e.g., water, food, soils) and volatilization into air. Additional chemical emissions occur during feedstock extraction and plastic production as well as at the end-of-life (e.g., during incineration). This is problematic because upon release, these chemicals can contaminate natural and human environments which, in turn, results in an exposure of biota and humans."

"Most plastic chemicals can be released. The release of chemicals from plastics has been documented in a multitude of studies, especially in plastic food contact materials, that is, plastics used to store, process or package food. A systematic assessment of 470 scientific studies on plastic food packaging indicates that 1086 out of 1346 analyzed chemicals can migrate into food or food simulants under certain conditions. Accordingly, 81% of the investigated plastic chemicals are highly relevant for human exposure. Newer research with advanced methods to study previously unknown plastic chemicals illustrates that this probably represents the tip of the iceberg. Studies using so-called nontargeted or suspect screening approaches show that commonly more than 2000 chemicals leach from a single plastic product into water. While less information is available on non-food plastics, this highlights two important issues. Firstly, plastics can release a large number of chemicals which, secondly, then become relevant for the exposure of biota, including humans (termed 'exposure potential' in this report)."

"Many plastic chemicals are present in the environment. Upon release, plastic chemicals can enter the environment at every stage of the plastic life cycle. Accordingly, plastic chemicals are ubiquitous in the environment due to the global dispersal of plastic materials, products, waste, and debris. For instance, a recent meta-analysis suggests that more than 800 plastic chemicals have been analyzed in the environment. However, this evidence is fragmented, and a systematic assessment of which compounds have been detected in the environment is lacking. Yet, the evidence on well-studied plastic chemicals indicates that these are present in various environments and biota across the globe, including remote areas far away from known sources. Examples include many phthalates, organophosphate esters, bisphenols, novel brominated flame retardants, and benzotriazoles. Based on the existing evidence on well-researched compounds, it is prudent to assume that many more plastic chemicals are omnipresent in the natural and human environment, including in wildlife and humans."

"Humans are exposed to plastic chemicals across the entire life cycle of plastics. This ranges from the industrial emissions during production, affecting fence line communities, to the releases during use, affecting consumers, and at the end-of-life, including waste handling and incineration. These releases have resulted in extensive exposures of humans to plastic chemicals. For example, many phthalates, bisphenols, benzophenones, parabens, phenolic antioxidants as well as legacy brominated and organophosphate flame retardants have been detected in human blood, urine, and tissues in different global regions. Humans can be exposed to plastic chemicals directly, such as phthalates and other additives leaching from PVC blood bags used for transfusion or leaching into saliva in children mouthing plastic toys. Indirect exposure occurs through the ingestion of contaminated water and foodstuffs that have been in contact with plastics (e.g., processing, packaging). The inhalation and ingestion of plastic chemicals from air, dust and other particulate matter are other important routes of exposure. Importantly, research shows that women, children, and people in underprivileged communities often have higher levels of exposure."

"Non-human organisms are exposed to plastic chemicals. The scientific literature provides rich information on the exposure of wildlife to plastic chemicals, in particular on bisphenols and phthalates in terrestrial and aquatic ecosystems as well as persistent organic pollutants, and antioxidants in marine environments. The United Nations Environment Programme highlights a global biomonitoring study which showed that seabirds from all major oceans contain significant levels of brominated flame retardants and UV stabilizers, indicating widespread contamination even in remote areas. Beyond seabirds, various other species are exposed to plastic chemicals according to the United Nations Environment Programme, such as mussels and fish containing with high levels of hazardous chemicals like HBCDD (hexabromocyclododecane), bisphenol A, and PBDEs (polybrominated diphenyl ethers), suggesting plastics as a probable source. Land animals, including livestock, are exposed to chemicals from plastics, such as PBDEs in poultry and cattle. and phthalates in insects. Importantly, plastic chemicals can also accumulate plants, including those for human consumption. This highlights a significant cross-environmental exposure that spans from marine to terrestrial ecosystems and food systems. However, while research on plastic chemical in non-human biota is abundant, it remains fragmented and has not been systematically compiled and assesses thus far."

"Endocrine disrupting chemicals in plastics represent a major concern for human health. The plastic chemicals nonylphenol and bisphenol A were among the earliest identified compounds that interfere with the normal functioning of hormone systems. These findings marked the beginning of a broader recognition of the role of plastic chemicals in endocrine disruption and dozens have since been identified as endocrine disrupting chemicals. This includes several other bisphenols, phthalates (used as plasticizers), benzophenones (UV filters), and certain phenolic antioxidants, such as 2,4-ditertbutylphenol. For example, strong scientific evidence links bisphenols to cardiovascular diseases, diabetes, and obesity. Accordingly, there is a strong interconnection between plastic chemicals and endocrine disruption."

"Additional groups of plastic chemicals emerge as health concern. The Minderoo-Monaco Commission's recent report comprehensively assesses the health effects of plastics across the life cycle, including plastic chemicals. In addition to phthalates and bisphenols, the report highlights per- and polyfluoroalkyl substances (PFAS) widely utilized for their non-stick and water-repellent properties. PFAS are strongly associated with an increased risk of cancer, thyroid disease, and immune system effects, including reduced vaccine efficacy in children. Additional concerns pertain to their persistence and their tendency to bioaccumulate in humans. In addition, brominated and organophosphate flame retardants have been linked to neurodevelopmental effects and endocrine disruption, adversely affecting cognitive function and behavior in children, as well as thyroid and reproductive health. Several other plastic chemicals are known to cause harm to human health, for example because they are mutagens (e.g., formaldehyde) or carcinogens with other modes of action, like melamine."

"Plastic chemicals also impact human health when released from production and disposal sites. These more indirect effects include the contribution of plastic chemicals to water and air pollution across the life cycle. For instance, chlorofluorocarbons, previously used as blowing agents in plastic production, can deplete the stratospheric ozone layer and thereby indirectly affect human health. Other issues include the promotion of antimicrobial resistance due to the dispersion of biocides transferring from plastics in the environment and the release of dioxins and PCBs from the uncontrolled burning of plastic wastes. The latter are especially toxic and persistent, and accumulate in the food chain, leading to increased human exposure."

"The health impacts of well-researched plastic chemicals are established. Arguably, there is a large body of evidence that links certain groups of plastic chemicals to a range of adverse health effects. These include but is not limited to bisphenols, phthalates, PFAS, and brominated and organophosphate flame retardants. Research focusses particularly on their endocrine disrupting effects, include adverse impacts on reproduction, development, metabolism, and cognitive function. However, it should be noted that research into other groups of plastic chemicals and other types of health effects remains largely fragmented and has rarely been systematically assessed. Here, initiatives such as the Plastic Health Map75 can support a more strategic approach."

"Plastic chemicals exert a host of adverse impacts on wildlife. This includes both acute and chronic toxicity in individual organisms and populations, as well as indirect effects across food webs. Ecotoxicological effects of heavy metals, such as cadmium and lead, as well as endocrine disrupting chemicals used in plastics, such as bisphenols, phthalates, and brominated flame retardants, have received the most research attention to date. Oftentimes, these endocrine disrupting chemicals induce environmental impacts at very low concentrations."

Jumping to page 24 for "Key Findings" of Part II of the report, "What is known about plastic chemicals":

"There are at least 16,000 known plastic chemicals. The report identifies 16,325 compounds that are potentially used or unintentionally present in plastics."

"There is a global governance gap on plastic chemicals. 6% of all compounds are regulated internationally and there is no specific policy instrument for chemicals in plastics."

"Plastic chemicals are produced in volumes of over 9 billion tons per year. Almost 4000 compounds are high-production volume chemicals, each produced at more than 1000 tons per year."

"At least 6300 plastic chemicals have a high exposure potential. These compounds have evidence for their use or presence in plastics, including over 1500 compounds that are known to be released from plastic materials and products."

"Plastic chemicals are very diverse and serve multiple functions. In addition to well-known additives, such as plasticizers and antioxidants, many plastic chemicals often serve multiple functions, for instance, as colorants, processing aids, and fillers."

"Grouping of plastic chemicals based on their structures is feasible. Over 10,000 plastic chemicals are assigned to groups, including large groups of polymers, halogenated compounds, and organophosphates."

Jumping to page 28, they have a visualization of the number of different plastic chemicals by use category:

3674 Colorants
3028 Processing aids
1836 Fillers
1741 Intermediates
1687 Lubricants
1252 Biocides
959 Monomers
897 Crosslinkers
883 Plasticizers
862 Stabilizers
843 Odor agents
764 Light stabilizers
723 Catalysts
595 Antioxidants
478 Initiators
389 Flame retardants
215 Heat stabilizers
205 Antistatic agents
128 Viscosity modifiers
103 Blowing agents
83 Solvents
74 Other additives
56 NIASs (non-intentionally added substances)
47 Others
31 Impact modifiers

On page 30 they have a table that gives you numbers by chemical category (with many groups missing because apparently there is a "long tail" of categories with fewer than 10 members that they didn't bother to include):

802 Alkenes
443 Silanes, siloxanes, silicones
440 PFAS (per- and polyfluoroalkyl substances)
376 Alkanes
202 Carboxylic acids salts
140 PCBs (polychlorinated biphenyls)
124 Aldehydes simple
89 Azodyes
75 Dioxines and furans
66 Alkylphenols
61 Ortho-phthalates
52 Aceto- and benzophenones
50 Phenolic antioxidants
45 PAHs (polycyclic aromatic hydrocarbons)
34 Bisphenols
29 Iso/terephthalates and trimellitates
28 Benzotriazoles
25 Ketones simple
24 Benzothiazole
22 Aromatic amines
20 Alkynes
20 Alkane ethers
18 Chlorinated paraffins combined
15 Aliphatic ketones
14 Aliphatic primary amides
11 Salicylate esters
10 Parabens
10 Aromatic ethers

Page 57 has a table of the number of chemicals by category considered hazardous. Page 61 has a table of the number of chemicals considered hazardous by usage category instead of chemical structure.

Last section is policy recommendations.

The report is 87 pages, but the document is 181 pages. The rest is a series of appendices, which they call the "Annex", which has the glossary, abbreviations, and detailed findings for everything summarized in the rest of the report.

https://plastchem-project.org/

#discoveries #chemistry #health #environment

waynerad@diasp.org

"Prickly paddy melon weed enzymes show potential as sustainable cement alternative." "An invasive weed that has long plagued the Australian agricultural industry could become a game-changing economic crop due to its potential to produce a cement alternative. Prickly paddy melon costs the agricultural industry around $100 million a year in lost grain yields, cattle deaths and control measures."

"But now researchers say enzymes produced by the paddy melon could be used to create a more sustainable form of cement and prevent soil erosion."

It took me a while to figure out what this was about. What it's about is a type of enzyme called urease. The reaction urease enzymes catalyze is urea + water = ammonia + carbon dioxide. What this has to do with concrete is there's this concept in concrete of "self healing" concrete, which works (to a limited extent) by having enzymes in the concrete that, when combined with water, precipitate calcium carbonate. Astute observers among you will point out at this point that calcium carbonate is not part of the chemical reaction that urease enzymes catalyze. Obviously you also need calcium present to precipitate calcium carbonate, but the real key is that the pH level is changed (more specifically, increased by the ammonia) such that dissolved calcium ions in the concrete will react with the carbon dioxide to precipitate calcium carbonate.

For those of you who like chemical formulas (they help me understand what's going on, but I hear some people are scared off by formulas), the reaction that the urease enzymes catalyze is:

(NH2)2CO + H2O -> 2NH3 + CO2

(that is, two of the (NH2) groups -- the lack of subscripts can be confusing).

And the formula for calcium carbonate is CaCO3.
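
To sketch the rest of the chain (my own back-of-the-envelope summary, not something quoted from the article): the ammonia raises the pH, which pushes the dissolved carbon dioxide toward carbonate, which then grabs any dissolved calcium:

NH3 + H2O -> NH4+ + OH- (this is what raises the pH)

CO2 + H2O -> H2CO3 -> 2H+ + CO3^2- (carbonate is favored at higher pH)

Ca^2+ + CO3^2- -> CaCO3 (precipitates out as solid calcium carbonate)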

What does all this have to do with the Australian weed prickly paddy melon? Apparently it's possible to produce these urease enzymes in massive quantities by extracting them from this plant.

I was disappointed by this as we humans consume crazy amounts of sand and are depleting the planet of sand for use in concrete, and I was hoping this would help with that. But no. In fact if you chase down the actual research paper (or the abstract, the full paper is paywalled), you'll find the researchers were primarily interested in urease enzymes for soil. Since ammonia increases pH, if a soil is acidic (remember, pH numbers under 7.0 are acidic, the lower the more acidic, while pH numbers above 7.0 are basic/alkaline, the higher the more basic/alkaline), adding these enzymes can reduce the acidity. Since ammonia is a nitrogen compound, it also helps to make nitrogen available for crops.

So, this won't help with concrete production from sand, and the article doesn't even try but instead talks about carbon footprint. Might be useful for limited "self-healing" concrete, and probably most useful for agricultural crop soil.

Prickly paddy melon weed enzymes show potential as sustainable cement alternative

#discoveries #chemistry #agriculture

waynerad@diasp.org

How "top heavy" is YouTube? 65% of videos have fewer than 100 views. 87% have fewer than 1,000 views. Only 3.7% of videos exceed 10,000 views, which is the threshold for monetization. Those 3.7% of views get 94% of views. The top 0.16% of videos get 50% of video views.

In other words, video views on YouTube follow a power law distribution, as you might have expected, but it's a lot steeper than you might have expected.

How was this figured out? Using a new but simple technique called "dialing for videos".

You may not realize it, but those YouTube IDs that look like a jumble of letters and numbers, like "A-SyeJaMMjI", are actually numbers. Yes, all YouTube video IDs are actually numbers. They're just not written in base 10. They're 64-bit numbers written in base 64. If you're wondering how YouTube came up with 64 digits, think about it: digits 0-9 give you 10, then lower case letters a-z give you 26 more, bringing you up to 36, then uppercase letters A-Z give you 26 more, getting you up to 62. You still need 2 more to get to 64, and YouTube chose the dash ("-") and the underscore ("_").
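
To make that concrete, here's a rough sketch (mine, not from the paper) of treating an 11-character video ID as a base-64 number in Python. The digit ordering below is the standard URL-safe base-64 alphabet; whatever ordering YouTube actually uses internally, the counting argument above is the same.

# Hypothetical sketch: interpret a YouTube video ID as a base-64 integer.
# The alphabet ordering here is an assumption for illustration only.
ALPHABET = ("ABCDEFGHIJKLMNOPQRSTUVWXYZ"
            "abcdefghijklmnopqrstuvwxyz"
            "0123456789-_")

def youtube_id_to_int(video_id):
    n = 0
    for ch in video_id:
        n = n * 64 + ALPHABET.index(ch)  # shift left by 6 bits, add the next digit
    return n

print(youtube_id_to_int("A-SyeJaMMjI"))  # one very large integer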

Because the numbers are randomly chosen 64-bit numbers, there are 2^64 possibilities, which in decimal is 18,446,744,073,709,551,616. That's much too large to try every number or even numbers at random. But the researchers discovered a quirk. Through the YouTube API they could do searches, and YouTube would do the search in a case-insensitive way. Well, except not for the last character for some reason. And it would allow 32 IDs to be searched on in the same query. So the researchers were able to find 10,000 videos (well, 10,016 actually) by doing millions of searches. This collection of 10,000 videos is likely to be more representative of all of YouTube than any other sample academic researchers have ever had. All previous attempts have resulted in biased results because they were influenced either by the recommendation system, personalized search results, or just whatever secret algorithms YouTube has that determine how it ranks videos that it enables you to find.

How big is YouTube? Their estimate is 9.8 billion videos. Or at least that's how big it was between October 5, 2022, and December 13, 2022, which is when they did their data collection. Their paper was finally published last December.

By looking at what percentage of their sample were uploaded in any given year, they can chart the growth of YouTube:

Year - Percentage of sample
2005 - 0.00%
2006 - 0.05%
2007 - 0.22%
2008 - 0.43%
2009 - 0.74%
2010 - 1.13%
2011 - 1.67%
2012 - 1.86%
2013 - 1.97%
2014 - 2.34%
2015 - 3.02%
2016 - 4.25%
2017 - 5.39%
2018 - 6.73%
2019 - 8.81%
2020 - 15.22%
2021 - 20.29%
2022 - 25.91%

Translating those percentages into a cumulative count of videos, in millions (remember, a thousand million is a billion), we get this list:

2005 - 0
2006 - 5
2007 - 27
2008 - 69
2009 - 142
2010 - 254
2011 - 418
2012 - 602
2013 - 796
2014 - 1,072
2015 - 1,325
2016 - 1,745
2017 - 2,278
2018 - 2,943
2019 - 3,813
2020 - 5,316
2021 - 7,321
2022 - 9,881
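
In case it's not obvious, that second list is a running total: each year's percentage of the roughly 9.8 billion videos, added up. Here's a quick sketch of that arithmetic (my own, using the paper's overall estimate):

# Sketch (mine): turn the per-year sample percentages into cumulative
# video counts, using the paper's ~9.8 billion total estimate.
TOTAL_VIDEOS = 9.8e9

percent_by_year = {
    2005: 0.00, 2006: 0.05, 2007: 0.22, 2008: 0.43, 2009: 0.74,
    2010: 1.13, 2011: 1.67, 2012: 1.86, 2013: 1.97, 2014: 2.34,
    2015: 3.02, 2016: 4.25, 2017: 5.39, 2018: 6.73, 2019: 8.81,
    2020: 15.22, 2021: 20.29, 2022: 25.91,
}

running_total = 0.0
for year in sorted(percent_by_year):
    running_total += percent_by_year[year] / 100 * TOTAL_VIDEOS
    print(year, round(running_total / 1e6))  # cumulative, in millions of videos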

73% of videos had no comments. 1.04% of videos had 100 comments or more, and those accounted for 55% of all comments in the sample.

"Likes" are evn more skewed, with 0.08% of videos getting 55% of likes.

YouTube made "Dislike" counts private in 2021.

Most channels had at least one subscriber and the average was 65. Subscriber counts, while less "top heavy", turned out to be weakly correlated with views. The researchers estimate 70% of views of any given video come from algorithms and not from subscribers or external links pointing to a video.

Median video length was 615 seconds (10 minutes, 15 seconds). 6.2% were 10 seconds or less, 38% were 1 minute or less, 82% were ten minutes or less, and only 3.9% were an hour or more.

Words that occurred most in metadata tags included "Sony" and "Playstation".

The researchers employed hand-coders to hand-code a subsample of 1,000 videos. They found only 3% of videos had anything to do with news, politics, or current events. 3.8% had anything to do with religion. 15.5% had just still images for the video part. (I actually see a lot of music videos like this -- just an album cover or photo of the artist and the rest is audio.) 19.5% were streams of video games. 8.4% were computer-generated but not a video game. 14.3% had a background design indicating they were produced on some sort of set. 84.3% were edited. 36.7% had text or graphics overlaid on the video. 35.7% were recorded indoors. 18.1% were recorded outdoors. (The remainder were both or unclear.) Cameras were "shaky" 52.3% of the time. A human was seen talking to the camera 18.3% of the time. 9.1% of videos recorded a public event. The video was something obviously not owned by the uploader, such as a movie clip, 4.8% of the time.

Sponsorships and "calls to action" were only present in 3.8% of videos.

96.8% of videos had audio. 40.5% were deemed by coders to be entirely or almost entirely music. Many of these were backgrounds for performances, video game footage, or slide shows.

53.8% had spoken language. 28.9% had spoken language on top of music.

For languages, "we built our own language detection pipeline by running each video's audio file using the VoxLingua107 ECAPA-TDNN spoken language recognition model."

Language distribution was:

English: 20.1%
Hindi: 7.6%
Spanish: 6.2%
Welsh: 5.7%
Portuguese: 4.9%
Latin: 4.6%
Russian: 4.2%
Arabic: 3.3%
Javanese: 3.3%
Waray: 3.2%
Japanese: 2.2%
Indonesian: 2.0%
French: 1.8%
Icelandic: 1.7%
Urdu: 1.5%
Sindhi: 1.4%
Bengali: 1.4%
Thai: 1.2%
Turkish: 1.2%
Central Khmer: 1.1%

"It is unlikely that Welsh is the fourth most common language on YouTube, for example, or that Icelandic is spoken more often than Urdu, Bengali, or Turkish. More startling still is that, according to this analysis Latin is not a 'dead language' but rather the sixth most common language spoken on YouTube. Of the top 20, Welsh, Latin, Waray-Waray, and Icelandic are not in the top 200 most spoken languages, and Sindhi and Central Khmer are not in the top 50 (Ethnologue, 2022). The VoxLingua107 documentation notes a number of languages which are commonly mistaken for another (Urdu for Hindi, Spanish for Galician, Norwegian for Nynorsk, Dutch for Afrikaans, English for Welsh, and Estonian for Finnish), but does not account for the other unusual results we have seen. We thought that some of the errors may be because of the amount of music in our sample, but removing the videos that are part of YouTube Music (which does not include all music) did not yield significantly different results."

"It is worth highlighting just how many of the most popular languages are not among the languages available in the YouTube autocaptioning system: Hindi, Arabic, Javanese, Waray-Waray, Urdu, Thai, Bengali, and Sindhi."

Dialing for videos: A random sample of YouTube

#solidstatelife #discoveries #winnertakeall #powerlawdistribution #youtube

waynerad@diasp.org

"Attention deficits linked with proclivity to explore while foraging".

"Our findings suggest that ADHD attributes may confer foraging advantages in some environments and invite the possibility that this condition may reflect an adaptation favouring exploration over exploitation."

I first encountered the "exploration versus exploitation" tradeoff in the context of reinforcement learning in computer science. The basic idea is, should you go to your favorite restaurant, or a restaurant you've never been to before? If you're in a new city where you've been to few restaurants, you should probably go to a new one. If you're in a city where you've lived for 10 years, and have been to most restaurants, maybe just go to the favorite. Where is the crossover point in between? You get the idea. Do you "exploit" the knowledge you have already, or do you "explore" to obtain more knowledge?

For the simplest cases, mathematicians have come up with formulas, and for complex cases, computer scientists have run simulations. In reinforcement learning, algorithms often have a tunable "hyperparameter" that can be used to increase the "exploration". Some problems require more "exploration" than others.
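
To make that concrete, here's what the simplest version of that knob looks like, a minimal epsilon-greedy sketch (a generic textbook example, not anything from the paper): with probability epsilon you explore a random option, otherwise you exploit the best option you've seen so far.

import random

# Minimal epsilon-greedy sketch. epsilon is the tunable "exploration"
# hyperparameter mentioned above; higher epsilon means more exploring.
def choose_restaurant(average_rating, epsilon=0.1):
    # average_rating: dict mapping each option to the mean payoff seen so far
    if random.random() < epsilon:
        return random.choice(list(average_rating))       # explore: try anything
    return max(average_rating, key=average_rating.get)   # exploit: the favorite

ratings = {"favorite place": 4.5, "new noodle shop": 0.0, "food truck": 3.0}
print(choose_restaurant(ratings, epsilon=0.3))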

It appears the process of evolution may have evolved a variety of personalities to prioritize "exploration" or "exploitation".

Attention deficits linked with proclivity to explore while foraging

#discoveries #psychology #evolution

waynerad@diasp.org

"Antimatter: Scientists freeze positronium atoms with lasers."

Whoa.

"Positronium" has a negatively charged electron and a positively charged antimatter electron, which is called a positron. A regular hydrogen atom is made up of a positively charged proton and negatively charged electron.

So you get something like an atom, with a negatively charged electron and a positively charged particle, it's just that the positively charged particle is an antielectron (positron) instead of a proton. And because an antielectron (positron) has the same mass as a regular electron, the combination is lighter than a proton-electron pair, also known as a hydrogen atom.

I didn't know positronium even existed until I read this (you all have probably known for ages, huh?) so I did some digging to learn more about it. The way the process works is, first the "atom smasher" (really subatomic particle smasher) at CERN smashes subatomic particles together, and if the collisions are sufficiently high-energy, some of that energy (according to the famous E = mc^2 formula) gets converted into matter -- and not just any matter but particle-antiparticle pairs.

The next step is to slow them down. CERN has an Antiproton Decelerator which evidently can also slow down antielectrons (positrons).

"The Antiproton Decelerator is a ring composed of bending and focussing magnets that keep the antiprotons on the same track, while strong electric fields slow them down. The spread in energy of the antiprotons and their deviation from their track is reduced by a technique known as 'cooling'. Antiprotons are subjected to several cycles of cooling and deceleration until they are slowed down to around a tenth of the speed of light. A newer deceleration ring, ELENA (Extra Low ENergy Antiproton), is now coupled with the Antiproton Decelerator. This synchrotron, with a circumference of 30 metres, slows the antiprotons even more, reducing their energy by a factor of 50, from 5.3 MeV to just 0.1 MeV. An electron cooling system also increases the beam density. With ELENA, the number of antiprotons that can be trapped increases by a factor of 10 to 100."

The next step involves trapping the antielectrons in a Penning-Malmberg trap. This is complicated. Let's just say it works by using a combination of magnetic and electric fields to control the trajectory of the particles.

The next step involves producing positronium by combining the antielectrons (positrons) with regular electrons. This is done by directing them into silica (silicon dioxide) with special "nanochannels" in it. I don't understand how this works. You would think funneling a boatload of antimatter into regular matter would just cause the antimatter and regular matter to meet and annihilate. But the silicon dioxide with the "nanochannels" is "porous" in such a way that a "quantum confinement" effect takes place that allows the antielectrons (positrons) to combine with regular electrons.

Finally we get to the step this news article is about: the laser cooling.

The way laser cooling works is by directing lasers at a material from all 6 directions: all 4 sides plus above and below. The laser light is at a frequency that matches an energy transition of the atom (if positronium can be called an "atom"; it can be treated like one for laser cooling purposes). Atoms have energy levels, and when light comes in at the same frequency as a transition between energy levels, the atom can absorb the energy, in the form of absorbing a photon, and make the energy transition. Later it can "fall back" to the lower energy state and re-emit the photon. The key is that the atom absorbs light from a particular direction but emits it in a random direction. Because absorbing and emitting photons changes the momentum of the atom, this reduces the momentum of the particle in the direction of the laser. By doing lasers in all 6 directions, a cooling effect can be achieved on the atoms. In this experiment they specifically say they used an alexandrite laser that produces 243-nanometer wavelength light, which corresponds to the transition between the ground state and the next energy level above the ground state.
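
A quick sanity check on that number (my own arithmetic, not from the article): a 243 nm photon carries about 5.1 eV, and positronium's energy levels are roughly hydrogen's cut in half (its reduced mass is half an electron mass), which puts the gap between the ground state and the first excited state at about 5.1 eV as well. So the wavelength checks out.

# Sanity-check arithmetic (mine, not from the article).
h = 6.626e-34   # Planck's constant, J*s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electron-volt

photon_energy_eV = h * c / 243e-9 / eV
print(round(photon_energy_eV, 2))  # ~5.1 eV for a 243 nm photon

# Positronium levels are roughly hydrogen's halved: E_n ~ -6.8 eV / n^2,
# so the n=1 -> n=2 gap is 6.8 * (1 - 1/4) = 5.1 eV. It matches.
print(round(6.8 * (1 - 1/4), 2))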

The cooling effect is further enhanced with a Doppler effect. When an atom is moving towards a laser, the laser's light appears blueshifted to the atom, in proportion to the atom's velocity. By tuning the laser to a frequency slightly lower (or alternatively, to a wavelength slightly longer) than the frequency normally absorbed by the atom, the setup can be made so the laser's photons are preferentially absorbed by atoms moving towards the laser, since for those atoms the blueshift brings the light into resonance.

In this experiment they reported they managed to drop the temperature of the positronium from 380 kelvin to 170 kelvin. By human standards, that means going from very hot (225 Fahrenheit / 105 Celsius) to extremely cold (-150 Fahrenheit / - 100 Celsius).

They want to make it even colder. Maybe that is made difficult by the fact that positronium usually only lasts for 142 nanoseconds before it annihilates into 3 gamma rays?

Oh wait, did I say positronium has only one ground state? It actually has two: one that lasts 142 nanoseconds and one that lasts 125 picoseconds. Remember, your metric prefixes go milli-, micro-, nano-, pico-, each one a factor of 1,000 smaller than the one before. All the laser cooling was done using the long-lived ground state, the 142 nanosecond one (which is called 1^3 S -- the other is called 1^1 S. The laser cooling was done using the 1^3 S to 2^3 P transition.)

What does all this portend for the future?

Being able to sufficiently supercool positronium would enable experiments with precision spectroscopy on antimatter, measuring all the light that gets emitted from all the energy level transitions with extreme precision. This would enable testing of the QED theory (quantum electrodynamics) to an unprecedented level. "Bound state" QED, where you have particles bound together, such as here where we have an antielectron (positron) and a regular electron together, predicts these energy transitions using the universe's fine structure constant (that dimensionless constant you've heard of that is approximately 1/137). So this would enable deep testing of QED and precision measurement of the fine structure constant, as well as matter-antimatter symmetry, because we could see if the spectra for antielectrons (positrons) are the same as for regular electrons.

You would think the spectra for antielectrons (positrons) would be exactly the same as regular electrons, and current Standard Model theory predicts it will. But, matter and antimatter are not always perfectly symmetrical mirror images of each other. In fact it's obvious they can't be, because we live in a universe with matter, a consequence of an asymmetry between matter and antimatter during the big bang.

If you're wondering why the material needs to be super cold for high precision spectroscopy, it's because the same Doppler effect mentioned earlier causes spectral lines to spread out when a material has a lot of kinetic energy and along with that, a lot of variation in the velocities of the particles.

Finally, there's the possibility of a gamma-ray laser. So, there's a theory that if you could cool positronium all the way down to being a Bose-Einstein condensate, then the "atoms" of positronium would become coherent in such a way that when they annihilate and release their 3 (in the case of the long-lived 1^3 S ground state positronium) or 2 (in the case of the short-lived 1^1 S ground state positronium) gamma rays, the gamma rays can stimulate other "atoms" of positronium to annihilate as well. A Bose-Einstein condensate is a state of matter near absolute zero in temperature where atoms enter the same quantum state and share a wave function. This would result in the emission of coherent gamma-ray light -- a gamma-ray laser, in other words. A gamma-ray laser has never existed before.

Antimatter: Scientists freeze positronium atoms with lasers

#discoveries #physics #cern #antimatter

waynerad@diasp.org

A new technique has been invented for killing cancer cells by getting them to vibrate in unison using light.

"The atoms of a small dye molecule used for medical imaging can vibrate in unison -- forming what is known as a plasmon -- when stimulated by near-infrared light, causing the cell membrane of cancerous cells to rupture. According to the study published in Nature Chemistry, the method had a 99 percent efficiency against lab cultures of human melanoma cells, and half of the mice with melanoma tumors became cancer-free after treatment."

"Near-infrared light can penetrate far deeper into the body than visible light, accessing organs or bones without damaging tissue." "Near-infrared light can go as deep as 10 centimeters (~ 4 inches) into the human body as opposed to only half a centimeter (~ 0.2 inches), the depth of penetration for visible light."

"The molecular jackhammers are aminocyanine molecules, a class of fluorescent synthetic dyes used for medical imaging."

"These molecules are simple dyes that people have been using for a long time. They're biocompatible, stable in water and very good at attaching themselves to the fatty outer lining of cells. But even though they were being used for imaging, people did not know how to activate these as plasmons."

"Due to their structure and chemical properties, the nuclei of these molecules can oscillate in sync when exposed to the right stimulus. I saw a need to use the properties of plasmons as a form of treatment and was interested in James Tour's mechanical approach to dealing with cancer cells. I basically connected the dots."

"The molecular plasmons we identified have a near-symmetrical structure with an arm on one side. The arm doesn't contribute to the plasmonic motion, but it helps anchor the molecule to the lipid bilayer of the cell membrane."

The paper is paywalled so I'm just going by what's in the press release.

Molecular jackhammers' 'good vibrations' eradicate cancer cells.

#discoveries #chemistry #medicine #cancer

waynerad@diasp.org

CRISPR has been authorized for use as a gene therapy for sickle-cell disease and transfusion-dependent beta-thalassemia by the UK's Medicines and Healthcare products Regulatory Agency (MHRA). Not the FDA because this is the UK we're talking about.

"I am pleased to announce that we have authorised an innovative and first-of-its-kind gene-editing treatment called Casgevy, which in trials has been found to restore healthy haemoglobin production in the majority of participants with sickle-cell disease and transfusion-dependent beta-thalassaemia, relieving the symptoms of disease."

"Casgevy is designed to work by editing the faulty gene in a patient's bone marrow stem cells so that the body produces functioning haemoglobin. To do this, stem cells are taken out of bone marrow, edited in a laboratory and then infused back into the patient after which the results have the potential to be life-long."

MHRA authorises world-first gene therapy that aims to cure sickle-cell disease and transfusion-dependent beta-thalassemia

#discoveries #crispr

johnehummel@diasp.org

Did the intellect of the human species reach its apex in the 1970s (and fall since then)?

What NASA did in 1977 beats the shit out of what SpaceX has done to date.

The precision of the science is breathtaking.

All done with only the most primitive computers and data storage technology.

If I were a cosmologist, I'm not sure I wouldn't find this trivial or dangerously wrong. But I'm not, so I find it pretty amazing and telling.

I am given to suspect that computers have made us lazy and stupid. (How else could we view Musk as a genius or Trump as a leader?)

NASA Warns That Voyager 1 Has Made “Impossible” Discovery after 45 Years in Space

#NASA #Voyager #Discoveries

https://www.youtube.com/watch?v=JhRyu-sSgus

waynerad@diasp.org

"Superlensing without a superlens: microscopes boosted beyond limits".

So what this is about is, in light microscopes, the limit of resolution is half the wavelength of light used. Fine details cannot be seen below this limit. This technique breaks through the diffraction limit by a factor of nearly four times. This has the potential to greatly enhance biomedical imaging.

There is special interest in terahertz frequencies for biomedical imaging because light at terahertz frequencies has the ability to penetrate tissues deeper than visible light. It interacts differently with water and various biological materials, enabling images with good contrast. It is non-ionizing, which means it doesn't harm the tissues it penetrates. If hooked up to a spectrometer, it can be broken out into a spectrum and specific molecules identified. But it has a longer wavelength than visible light, so the diffraction limit is a problem.
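
To get a feel for why the diffraction limit is such a problem at terahertz frequencies, here's the arithmetic (mine, assuming 1 THz illumination just as a round number): the wavelength is about 0.3 millimeters, so the conventional half-wavelength resolution limit is about 0.15 millimeters, far coarser than the sub-micrometer detail visible-light microscopes can resolve.

# Quick arithmetic (mine): the diffraction limit at an assumed 1 THz.
c = 3.0e8              # speed of light, m/s
f = 1.0e12             # 1 terahertz, chosen just for illustration
wavelength = c / f     # = 3.0e-4 m, i.e. 0.3 mm
print(wavelength / 2)  # ~1.5e-4 m: the conventional half-wavelength limit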

This brings us to the virtual superlensing effect. The key to understanding this is to think of there being two types of light waves reflected off the object: propagating waves and evanescent waves. The "propagating waves" are what you normally think of as light waves and are what you use to see the object -- but you can only see features that are above the diffraction limit. The "evanescent waves" are the waves that carry the fine detail information, but they decay exponentially as you move away from the object under observation. The virtual superlens, however, is a clever way to capture the information carried by these "evanescent waves". The system uses Fourier expansions to mathematically "reverse" the phase of the "propagating waves" and simultaneously amplify the "evanescent waves". The amplification relies on using polarized light, which causes imaginary numbers to show up in the computation, which is what enables the "superlensing" effect to happen.

There is no physical lens. The virtual superlens is all calculations with Fourier expansions done as a post-processing step on a computer, after the measurement itself. If you're familiar with Fourier series, which break a time-based signal into its frequency components, the Fourier expansion technique here is related; it works on a time-varying 3-dimensional electric field rather than a one-dimensional time-based signal.
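
For a rough feel of what "reverse the phase of the propagating waves and amplify the evanescent waves" means numerically, here's a toy 1-D back-propagation sketch (my own illustration of the general idea, not the authors' actual algorithm): decompose a field measured some distance from the object into spatial frequencies, then undo the propagation. Spatial frequencies below the free-space cutoff get their phase reversed; those above it get re-amplified by the exponential factor they lost. Real implementations have to regularize that amplification, because it blows up measurement noise just as eagerly as signal.

import numpy as np

# Toy 1-D "virtual lens" sketch (illustrative only, not the paper's method).
wavelength = 1.0                 # work in units of the wavelength
k0 = 2 * np.pi / wavelength      # free-space wavenumber
d = 0.2 * wavelength             # assumed object-to-measurement-plane distance

n, dx = 512, wavelength / 16
x = np.arange(n) * dx
field_measured = np.exp(-((x - x.mean()) / wavelength) ** 2)  # stand-in data

kx = 2 * np.pi * np.fft.fftfreq(n, dx)   # spatial frequencies
kz = np.sqrt(k0**2 - kx**2 + 0j)         # imaginary for |kx| > k0 (evanescent)

spectrum = np.fft.fft(field_measured)
back_propagated = spectrum * np.exp(-1j * kz * d)  # phase reversal + exp. gain
field_at_object = np.fft.ifft(back_propagated)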

Superlensing without a superlens: microscopes boosted beyond limits

#discoveries #microscopy #terahertz #diffractionlimit #biomedical

waynerad@diasp.org

A CRISPR-free DNA editing system has been invented. It is claimed it can do effective base editing in the nucleus, mitochondria, and chloroplast genomes of plant and human cells.

The alternative system is called CyDENT and it's based on a preceding technology called TALE. "TALE" stands for "transcription-activator-like effector". Before explaining what CyDENT is about, it might first be worthwhile to explain a bit about TALE. TALE proteins, which were discovered in bacteria (Xanthomonas bacteria, if you care to know), have the special property that they can be reconfigured to match any DNA sequence. To make these proteins work as gene editors, they are fused to something called a deaminase. A deaminase, in turn, is a fancy word for a protein that can convert one DNA base to another. For example there are deaminases that convert C to T.

The shortcoming of this system is that the deaminases it uses are double-stranded DNA deaminases: the DNA they edit has to be double-stranded. The new CyDENT base editing system fuses TALEs with a FokI nickase and a single-strand-specific cytidine deaminase.

There are some additional components: an exonuclease and a uracil glycosylase inhibitor. Exonucleases pop nucleotides off the end of a DNA strand one at a time. They are called "exo" because there's such a thing as endonucleases, which operate in the middle of a DNA strand. There are different exonucleases depending on which end of the DNA strand you want to pop, and there are exonucleases for RNA, too.

I have no idea what the uracil glycosylase inhibitor is for. A uracil glycosylase inhibitor is a protein that inhibits uracil DNA glycosylase, which is an enzyme that removes uracil from DNA. It's considered a "repair" enzyme. RNA uses uracil while DNA uses thymine. So if there's any uracil in DNA, it shouldn't be there.

Ok, at this point we turn our attention back to the main actor, the FokI nickase. It's called a "nickase" because it "nicks" DNA strands. Great name, eh? In case you're wondering, no, the "FokI" part of the name doesn't come from the great Dr. FokI, it comes from the name of the bacterium where it was discovered, Flavobacterium okeanokoites.

So the idea is that the TALE proteins find the target strand of DNA, the FokI nickase "nicks" the DNA strand, exposing the desired region, the exonuclease pops nucleotides off one strand, leaving a short single-stranded DNA segment. The DNA deaminase steps in here and modifies the DNA in the single-stranded segment.

If you're wondering what the point of all this is, it allows editing of single-stranded DNA in the absence of a bunch of stuff CRISPR requires, in particular the Cas9-guided R-loop structure and double-stranded DNA deaminases mentioned in the article. The R-loop structure is a complex 3-dimensional structure that forms when the Cas9 protein that the CRISPR system relies on finds its DNA target. Believe it or not, this structure involves the RNA strand that is the "guide" for the CRISPR editing process forming a "hybrid" double helix with one of the DNA strands -- the "R-loop" actually consists of an RNA-DNA helix and a single-stranded DNA loop that is intertwined with the RNA-DNA helix.

Apparently it is really crucial for the system to be RNA-free, as that is the key to enabling it to work in the nucleus and in "organellar genomes" such as mitochondria and plant chloroplasts.

This might be a good time to tell you what CyDENT stands for, now that you'll understand all the terms in it. It stands for "cytosine deamination by nicking and editing technology".

Scientists Develop novel base editors - Institute of Genetics and Developmental Biology, Chinese Academy of Sciences

#discoveries #biology #genetics #geneediting #crispr

waynerad@diasp.org

"Food brands owned by tobacco companies -- which invested heavily into the US food industry in the 1980s -- appear to have 'selectively disseminated hyperpalatable foods' to American consumers."

"While tobacco companies divested from the US food system between the early to mid-2000s, perhaps the shadow of Big Tobacco has remained. The new KU study finds the availability of fat-and-sodium hyperpalatable foods (more than 57%) and carbohydrate-and-sodium hyperpalatable foods (more than 17%) was still high in 2018, regardless of prior tobacco ownership, showing these foods have become mainstays of the American diet."

"The majority of what's out there in our food supply falls under the hyperpalatable category. It's actually a bit difficult to track down food that's not hyperpalatable. In our day-to-day lives, the foods we're surrounded by and can easily grab are mostly the hyperpalatable ones. And foods that are not hyperpalatable, such as fresh fruits and vegetables -- they're not just hard to find, they're also more expensive."

More specifically, what the study did was use United States Department of Agriculture (USDA) data sets to identify hyper-palatable foods that were and were not owned by US tobacco companies from 1988 to 2001. They found 105 that were and 587 that were not owned by tobacco companies.

"Tobacco-owned foods were 29% more likely to be classified as fat and sodium hyper-palatable foods and 80% more likely to be classified as carbohydrate and sodium hyper-palatable foods than foods that were not tobacco-owned between 1988 and 2001 (P-values = .005-0.009). The availability of fat and sodium hyper-palatable foods (>57%) and carbohydrate and sodium hyper-palatable foods (>17%) was high in 2018 regardless of prior tobacco-ownership status, suggesting widespread saturation into the food system."

Study shows food from tobacco-owned brands more 'hyperpalatable' than competitors' food | The University of Kansas

#discoveries #nutrition

waynerad@diasp.org

"Researchers are finding signs of REM sleep in a broader array of animals than ever before: in spiders, lizards, cuttlefish, zebrafish. The growing tally has some researchers wondering whether dreaming, a state once thought to be limited to human beings, is far more widespread than once thought."

"Marsupial mammals called echidnas show characteristics of REM and non-REM sleep at the same time. Reports on whales and dolphins suggest that they may not experience REM at all. Birds have REM sleep, which comes with twitching bills and wings and a loss of tone in the muscles that hold up their heads."

"Researchers reported a sleep-like state in cuttlefish."

"Researchers have since observed a similar state in octopuses."

"Researchers have also observed a REM-like stage in bearded dragons by recording signals from electrodes in their brains. And they have reported at least two sleep states in zebrafish based on the fishes' brain signatures."

"In pigeons, sleep scientist Gianina Ungurean of the Max Planck Institute for Biological Intelligence in Munich and the University Medicine Göttingen has observed, with colleagues, that pupils constrict during REM as they do during courtship behavior."

"Roundworms appear to have one sleep state only."

A look into the REM dreams of the animal kingdom

#discoveries #biology #sleep #rem #dreaming

waynerad@diasp.org

The "um"s of regular people follow a Poisson distribution, but the "um"s of professional presenters don't. But when the professional presenters are no longer doing professional presenting and doing unscripted Q&A, their "um"s -- or more precisely, the time intervals between them -- follow a Poisson distribution just like regular people. The Poisson distribution is a statistical distribution that assumes events are random and independent but happen at a predictable rate.

I found a weird pattern in how people 'Uhmmm" - Not David

#discoveries #statistics #poissondistribution #communication

waynerad@diasp.org

"The illusion of the mind-body divide is attenuated in males."

Hmm that's weird. This research wasn't originally motivated by interest in difference between males and females, it was motivated by testing a theory about Dualism -- whether Dualism is innate or learned.

"People are intuitive Dualists -- they tend to consider the mind as ethereal, distinct from the body. For example, people believe that one's psychological traits (e.g., thinking about one's wife) can persist in the afterlife, without one's body. Conversely, if one's body were to be duplicated, the replica, people state, would preserve one's physical characteristics (e.g., a scar), but not one's knowledge and beliefs."

The hypothesis to be tested is that intuitive Dualism is natural for humans because "it arises from two core principles that lie deep within human cognition. One such principle is intuitive physics; the other is theory of mind."

"Per intuitive physics, people consider objects as cohesive entities that can only move by immediate contact, and this knowledge manifests in early infancy. For example, upon seeing one moving ball contact another (stationary) ball, infants expect the stationary ball to launch immediately. So, if the launch is delayed, infants are surprised -- a response evident already in newborn."

"Theory of mind, however, leads us to attribute the actions of agents to their mental states -- to their beliefs, knowledge, desires and goals. Thus, upon seeing a person reach their hand towards a water bottle, one would spontaneously infer that the person believes there is water in the bottle, and it is this belief that caused their hand to move. This inference, however, blatantly violates intuitive physics, as it presumes that the agent's hand (a physical object) can move spontaneously, in the absence of contact with another physical objects. And that violation of intuitive physics is bound to elicit cognitive tension. To resolve the dissonance, people might presume that mental states -- the causes of the agents' movement -- aren't physical. Thus, the tension between intuitive theory of mind and intuitive physics can explain how intuitive Dualism arises naturally."

"The hypothesis that Dualism is natural also generates a testable prediction. If Dualism arises (in part) from theory of mind, then if theory of mind were to be attenuated, then so should intuitive Dualism."

They recount previous research showing autistic people are less adept at reading the minds of others.

"If autism attenuates theory of mind, and theory of mind begets Dualism, then in people with autism, the mind -- body divide ought to be attenuated. Moreover, Dualist intuitions ought to correlate with sensitivity to theory of mind. This, is exactly what was found."

Here's where they decided to make this a gender experiment.

"To further evaluate the origins of Dualism, and its link to theory of mind, here, we explore systematic gradations in theory of mind occurring within the neurotypical population -- differences between females and males. Numerous studies have found that males underperform females in theory of mind tasks."

In Experiment 1, participants are invited to imagine whether a replica of their body would have their traits. Experiment 2 does the reverse and asks people to imagine the afterlife, after the demise of the body, and which traits would continue. Half of the traits were about actions and emotions (e.g., walking, anger) -- traits that people readily anchor in the body (e.g., legs, face) -- and the other half were about knowledge and beliefs, traits not linked to any particular bodily organ (e.g., having a concept of a person, forming sentences). There were 80 such traits on the list.

In Experiment 3, participants were simply asked whether each trait was "inborn", yes or no. "Inborn traits are ones that develop in humans spontaneously. Some of these traits (e.g., having five fingers) are present at birth, but others (e.g., facial hair in men) can appear later in development. All inborn traits, however, emerge in the typical course of development, even if an individual has never had the opportunity to witness these behaviors in other people."

The hypothesis to be tested in Experiment 3 was the idea that Dualism promotes Empiricism -- but here "Empiricism" has a funny definition, which is "the belief that psychological traits cannot be innate."

"Males, in this sample, considered the psyche as more strongly embodied than females: they believed that epistemic states are more likely to emerge in a replica of one's body (in Experiment 1) and that psychological traits are less likely to persist upon the body's demise, in the afterlife (in Experiment 2). Experiment 3 further showed that males are also more likely to consider psychological traits as innate... suggesting that Dualism begets Empiricism."

They report that they followed this up with theory of mind tests; males in this sample scored lower than females, and their theory of mind scores correlated with their Dualist intuitions.

Also possibly of note, this was done at Northeastern University and Northeastern University has so many female psychology students that the researchers had to use an outside company to recruit male participants. A total of 242 people participated in the experiment.

The illusion of the mind–body divide is attenuated in males

#discoveries #psychology #dualism

waynerad@diasp.org

The complete genomes of 240 mammal species have been sequenced. Along with a "whole-genome alignment", where they line up the DNA sequences in one species with the equivalent DNA sequence in another species. The researchers identified 4,552 "conserved" regions. When talking about DNA, a region is considered either "fast-evolving" if it changes rapidly, or "conserved" if it stays the same over time. If a region is "conserved", there's probably a reason -- the sequence needs to be the way it is because it encodes some essential function. Knowing that a region is "conserved", however, doesn't tell you what the reason is. It's just a clue that there's something important going on. And just as a region can be "conserved" within a species over time, a region conserved across related species is probably conserved for a reason. But you don't necessarily know what the reason is.

Examining the conserved regions, the researchers found regions that relate to hibernation, olfaction, vocal learning, and brain size. Not only that, but these evolved multiple times -- sort of. For example, hibernation evolved in both bears and bats. But it is thought that the underlying "coding" genes are conserved across mammal species -- it's the non-coding DNA that controls what genes get switched "on" and "off" that is thought to have evolved independently. "Coding" means the DNA encodes proteins which are assembled by ribosomes after the information on the DNA is transcribed to messenger RNA (mRNA) and ferried over to the ribosomes. "Non-coding" DNA doesn't encode for proteins. It was thought to be "junk" before it was realized that often this "non-coding" DNA controls the expression of the "coding" DNA, through such mechanisms as transcription factors.

A highly detailed phylogenetic tree of mammal species was made that resolves disputes in previous phylogenetic trees. The effects of the Cretaceous-Paleogene (K-Pg) extinction, which wiped out the dinosaurs and enabled placental mammals to take over land and rapidly diversify, are measurable in the genomes.

In addition to the 4,552 nearly perfectly preserved regions, 10.7% of the human genome was considered "unusually conserved" across mammal species. 57.6% of coding bases fall into this category, but on the flip side, 80.7% of the bases in this category are non-coding. If that makes you scratch your head, review your Bayes' Theorem: A given B is different from B given A. Given a base known to be coding, there's a 57.6% chance it's in the "unusually conserved" category. Given a base known to be "unusually conserved", there's an 80.7% chance it's non-coding.
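
As a quick sanity check (a back-of-the-envelope sketch that uses only the three percentages quoted above, nothing else from the paper), Bayes' theorem shows the two statements are compatible because coding bases are rare to begin with:

```python
# Bayes' theorem check on the two conditional statements above.
# Known from the text: P(conserved) = 0.107, P(conserved | coding) = 0.576,
# P(non-coding | conserved) = 0.807, so P(coding | conserved) = 0.193.
p_conserved = 0.107
p_conserved_given_coding = 0.576
p_coding_given_conserved = 1 - 0.807

# Bayes: P(coding | conserved) = P(conserved | coding) * P(coding) / P(conserved)
# Rearranged to give the implied fraction of the genome that is coding:
p_coding = p_coding_given_conserved * p_conserved / p_conserved_given_coding
print(f"implied coding fraction of the genome: {p_coding:.3f}")  # about 0.036

# Most coding bases are unusually conserved, yet most unusually conserved bases
# are still non-coding, simply because non-coding bases vastly outnumber coding
# ones in the genome.
```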

Approximately 439,461 candidate "regulatory elements", regions of non-coding DNA that regulate the transcription of genes, were identified for further study. In addition, 2,024,062 transcription factor binding sites were found. Transcription factors are proteins that turn genes "on" and "off" by controlling their "transcription" from DNA to messenger RNA (mRNA), which goes to the ribosome to direct the assembly of a protein.

In addition to finding regions that are the same across mammals, the researchers looked for regions that are different in us -- the so-called human accelerated regions: regions that are conserved across other mammals, including our closest relatives, the chimpanzees, but that have changed rapidly in the human lineage. These are thought to underlie human-specific traits. Some of them are thought to involve neurodevelopmental genes.

There are also human-specific conserved deletions, where short (2-3 base pair) deletions took place. 10,032 of these human-specific conserved deletions were identified. Many of these are believed to affect neurons and nerve tissue and have affected our cognitive function.

For alignment, they created a new machine-learning tool that can align genes even if they underwent translocations (where a gene or part of a gene moves to a different chromosome) or inversions (where a portion of a DNA sequence gets flipped so it reads in the reverse direction). It works for non-coding regions as well as for the parts within genes that don't end up encoding protein -- the "introns" that get spliced out, as opposed to the protein-coding "exons".

They also looked at transposable elements, colloquially known as "jumping genes". These are DNA sequences that change their positions within a genome, sometimes with dramatic effects such as altering the cell's genetic identity or even its genome size. The lowest genomic percentage of transposable elements was found in the star-nosed mole (27.6%), and the largest percentage was seen in the aardvark (74.5%), whose higher transposable element count corresponds with larger genome size.

Finally, they looked at species that are nearing extinction and found that small population sizes decrease heterozygosity and increase the deleterious genetic load, making extinction even more likely. To determine deleterious genetic load, they looked at regions known to be vital in well-studied species such as mice.
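
To make the heterozygosity point concrete, here's the textbook genetic-drift relationship (a standard population-genetics formula, not a calculation from this paper): expected heterozygosity shrinks by a factor of (1 - 1/(2Ne)) each generation, so small effective population sizes bleed diversity fast.

```python
# Standard drift formula: H_t = H_0 * (1 - 1/(2*Ne))**t, where Ne is the
# effective population size and t is the number of generations.
def heterozygosity_after(h0: float, ne: float, generations: int) -> float:
    return h0 * (1 - 1 / (2 * ne)) ** generations

for ne in (50, 500, 5000):
    h = heterozygosity_after(h0=0.30, ne=ne, generations=100)
    print(f"Ne = {ne:>4}: heterozygosity after 100 generations = {h:.3f}")

# A population with Ne = 50 keeps only ~37% of its starting heterozygosity
# after 100 generations, while one with Ne = 5000 keeps ~99% -- which is why
# shrinking populations accumulate genetic problems faster than they can shed them.
```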

What makes a mammal? 423,000 newly identified DNA regions guide our genes

#discoveries #biology #genomics #mammals

waynerad@diasp.org

Basal metabolic rate (the rate of energy consumption "at rest" without any physical or mental exertion) has declined in the US since the 1980s. Down 220 kcal/day on average for males and 122 kcal/day for females.

We are currently expending about 220 kcal/d less for males and 122 kcal/d less for females

#discoveries #biology #metabolism

waynerad@diasp.org

A new system for recycling wind turbine blades has been invented.

"Turbine blades have previously been challenging to recycle due to the chemical properties of epoxy resin, a resilient substance that was believed to be impossible to break down into re-usable components. This has led to many technology leaders attempting to replace or modify epoxy resin with alternatives that can be more easily treated. Vestas' solution is enabled by a novel chemical process that can chemically break down epoxy resin into virgin-grade materials."

"The newly discovered chemical process shows that epoxy-based turbine blades, whether in operation or sitting in landfill, can be turned into a source of raw material to potentially build new turbine blades. As the chemical process relies on widely available chemicals, it is highly compatible for industrialisation, and can therefore be scaled up quickly."

Apparently the exact chemical process is a secret, though.

Vestas unveils circularity solution to end landfill for turbine blades

#discoveries #chemistry #energy #renewableenergy #recycling