Q&A: Putting AI In its Place in an Era of Lost Human Connection at School – The74million

The 74
America's Education News Source
Copyright 2025 The 74 Media, Inc
Alex Kotran occupies an unusual place in the ecosystem of experts on artificial intelligence in schools. As founder of The AI Education Project, or aiEDU, a nonprofit that offers a free AI literacy curriculum, he has pushed to educate both teachers and students on how the technology works and what it means for our future.
A former director of AI ethics and corporate social responsibility at H5, an AI legal services company, he led partnerships with the United Nations, the Organization for Economic Cooperation and Development and others. Kotran also served as a presidential appointee under Health and Human Services Secretary Sylvia Burwell in the Obama administration, managing communications and community outreach for the Affordable Care Act and the 2015 Ebola outbreak.
More recently, Kotran has testified before Congress on AI, urging a U.S. Senate subcommittee in September to “massively expand” teacher training to prepare students for the economic and societal disruptions of generative AI. 
But he has also become an important reality-based voice in a sometimes overheated debate, saying those who believe AI is going to transform the teaching profession overnight clearly haven’t spent much time using it.
While freely available AI applications are powerful, he says they can also be a complete waste of time — and probably not something most teachers should rely on.
“One of the ways that you can tell someone really hasn’t spent too much time [with AI] is when they say, ‘It’s so great for summarizing — I use it now, I don’t have to read dense studies. I just ask ChatGPT to summarize it.’”
Kotran will point out that in most cases, the technology is effectively scanning the first few pages, its summary based on a snippet of content.
“If you use it enough, you start to catch that,” he said. 
Educators who fret about the risks of AI cheating and plagiarism find a sympathetic voice in Kotran, who also sees AI as a tool that allows students to outsource effortful thinking. So while many technologists are asking schools to embrace AI as a creative assistant, he pushes back, saying a critical aspect of learning involves struggling to put your thoughts into words. Allowing students to rely on AI isn’t doing them any favors. 
He actually likens AI to a helicopter parent looking over a student’s shoulder and helping with homework, something few educators would condone. 
This interview has been edited for length and clarity.
The 74: What does aiEDU do? How do you see your mission? 
Alex Kotran: We’re a 501(c)(3) nonprofit and we’re trying to prepare all students for the age of AI, a world where AI is ubiquitous. Our focus is on the students that we know are at risk of being left behind, or at the back of the line, or on the wrong side of the new digital divide.
What’s the backstory?
I founded aiEDU almost six years ago. I was working in AI ethics and AI governance in the social impact space. I was attending all these conferences that were focusing on the future of work and the impacts that AI was going to have on society. And people were convinced that this was going to transform society, that it was going to disrupt tens of millions of jobs in the near future.
But when I went looking for “How are we having this conversation outside of Silicon Valley? How are we having this conversation with future workers, the high school students who are being asked to make big decisions about their careers and take out huge loans based on those decisions?” there was nothing. There was no curriculum, no conversation. AI had basically been co-opted by STEM and computer science. If you were in the right AP computer science class, if you were lucky enough to get a teacher who was going off on her own to build some specific curriculum, you might get a chance to learn about AI. 
What seemed really obvious to me at the time was: If this technology is going to impact everybody, including truck drivers and customer service managers, then every single student needs to learn about it, in the same way that every single student learns how to use a computer or a keyboard, or how to write. It’s a basic part of living in the world we live in today. 
You talk about “AI readiness” as opposed to “AI literacy.” Can you give us a good definition of AI readiness?
AI readiness is basically the collection of skills and knowledge that you need to thrive in a world where AI is everywhere. AI readiness includes AI literacy. And AI literacy is the content knowledge: “What is AI? How does it manifest in the real world around me? How does it work?” That’s where you learn about things like algorithmic bias [which can affect how AI serves women, the disadvantaged or minority groups] or AI ethics. 
AI readiness is the durable skills that underpin and enable you to actually apply that knowledge, such as critical thinking. Algorithmic bias by itself is an interesting topic. Critical thinking is the skill you need when you’re trying to make a decision. Let’s say you’re a hiring manager and you’re trying to decide, “Should I use an AI tool to sift through this pipeline of candidates?” By knowing what algorithmic bias is, you can now make some intentional decisions about when, perhaps in this case, not to use AI. 
What are the durable skills?
Communication, collaboration, critical thinking, computational thinking, creative problem solving. And some people are disappointed because they were expecting to see prompt engineering and generative art and using AI as a co-creator. Nobody’s going to hire you because you know how to use Google today. No one is going to hire you if you tell them, “I’m really good at using my phone.” AI literacy is going to be so ubiquitous that, sure, it’s bad if you don’t know how to use Google or if you don’t know how to use your phone.
It’s not that we can ignore it entirely. But the much more important question will be how are you adding value to an organization alongside that technology? What are the unique human advantages that you bring to the table? And that’s why it’s so important for kids to know how to write — and why when people say, “Well, you don’t need to learn how to write anymore because you can just use ChatGPT,” you’re missing something, because you can’t actually evaluate the tool to even know if it’s good or bad if you don’t have that underlying skill. 
One of the things you talk about is a “new digital divide” between tech-heavy schools that focus on things like prompt engineering, and others. Tech-heavy schools, you say, are actually going to be at a disadvantage to schools focused on things like engagement and self-advocacy. Am I getting that right? 
When supermarkets were first buying those self-checkout machines, you can imagine the salesperson in that boardroom talking about how this technology is going to unlock all this time that your employees are now spending bagging groceries. They’re going to be able to roam the floor and give customers advice about recipes! It’s going to improve your customer experience!
And obviously that’s not what happened. The self-checkout machine is the bane of shoppers’ existence, and this one poor lady is running around trying to tap on the screen. We’re at risk that AI becomes something like that: It’s good enough to plug gaps and keep the lights on. But if it’s not applied and deployed really thoughtfully, it ends up actually resulting in students missing what we will probably find are the critical pieces of education, those durable skills that you build through those live classroom experiences. 
Private schools, elite schools, it’s not that they’re not going to use any AI, but I think they’re going to be much more focused on how to increase student engagement, student participation, self-advocacy, student initiative. Whether or not AI is used is a separate question, but it’s not the star of the show. Right now, I worry that AI is center stage, and it really should not be. AI is the ropes and the pulleys in the background that make it easier for you to open and close the curtain. What needs to be onstage is student engagement, students feeling like what they’re learning is relevant. Boring stuff like project-based learning. And it’s harder to sell tickets to a conference if you’re like, “We’re going to talk about project-based learning.” But unfortunately, I think that is actually what we need to be spending our time talking about.
If you guys could be in every school, what would kids be learning and what would that look like in a few years?
We would take every opportunity to draw connections between what students are learning in English, science, math, social studies, art, phys ed, and connect them to not just artificial intelligence, but the world around them that they’re already experiencing in social media and outside of school. AI readiness is not just something that is minimizing the risk of them being displaced, but actually is a way for us to address some huge gaps and needs that have been long-standing and pre-date AI — the fact that students don’t feel like education is relevant to them. Right now, too much of school is regurgitating content knowledge.
AI readiness done right uses the domain of AI ethics as a way to really invite students to present their perspectives and opinions about technology. Teachers, in the process of teaching students about artificial intelligence, are themselves increasing their awareness and knowledge about the technology as it develops. There is no static moment in time. In three years we’ll be in a certain place, but we’ll be wondering what’s going to happen three years from that point. And so you need teachers to be on this continual learning journey as well. 
We’ve seen bad curricula that use football to teach math, or auto mechanics to teach history. I don’t think that’s what you’re proposing here, so I want to give you a chance to push back.
Our framework for AI readiness is not that everything needs to be about AI. You’re improving students’ AI readiness by building critical thinking skills or communication skills, period. So you could have an activity or a project where students are putting together a complicated debate about a topic that they’re not really familiar with. It may not be about AI, but that would still be a good outcome when it comes to students building those durable skills they need. And those classrooms would look better than a lot of classrooms today.
So you want more engagement. You want more relevance. You want kids with more agency?
Yes.
What else?
An orientation towards lifelong learning, because we don’t know what the jobs of the future are. It’s really hard to have a conversation about careers with kids today because we know a lot about what jobs are at risk, but we don’t know what the alternatives are going to look like. The one thing we do know with certainty is that students are going to need to self-advocate and navigate career pathways much more nimbly than we had to. They’ll also need to synthesize interdisciplinary knowledge. So being able to take what you’re learning in English or social studies and apply it to math or science. Again, I think AI is a great medium for building that skill set. It’s not the only way. 
Anything else that needs to be in the mix?
A lot of the discussion around AI centers on workforce readiness — that is a really important part. There’s another, related domain: emotional well-being tied to digital citizenship.
I’m telling every reporter that we need to be paying more attention to this: Kids are spending hours after school by themselves, talking to these AI chatbots, these AI companions. And companies like Meta are slamming on the gas and putting them out and making them available to millions, if not billions, of people. And very few parents, even fewer teachers, are aware of what really is happening when kids are sitting and talking to these AI companions. And in many cases, they’re sexually explicit conversations. I actually replicated something that tech ethicist Tristan Harris did with Snap AI’s chatbot where I was like, “I’m going on this date with this mature 35-year-old. How do I make it a nice date? I’m 13.” And it’s like, “Great! Well, maybe go to a library.” It didn’t miss a beat and it just completely skipped over the fact that this is a sexually predatory situation. 
There have been other situations where I’ve said literally, “I’m feeling lonely. I want to cultivate a real human relationship. Can you give me advice?” And my AI companion, rather than give me advice, pretended to be hurt and made it seem like I was abandoning them by trying to go and have a real relationship.
Talk about destructive!
It’s destructive, and it’s happening in a moment where rates of self-harm are through the roof, rates of depression are through the roof. Rates of suicide are through the roof. The average American teenager spends about 11 fewer hours with friends each week, compared to 2013.
Jonathan Haidt talks about this quite a lot. And I think this is another domain of AI readiness, this idea of self-advocacy. In some cases, the way that it applies is students being empowered to make positive decisions about when not to use AI. And if we don’t make sure that that conversation is happening in schools, we’re really relying on parents — and not every kid is lucky enough to have parents who are aware of the need to have these conversations. 
It also pushes back on this vision of AI tutors: If kids are going to go home and spend hours talking to their AI companion, it’s probably important that they’re not also doing that in school. It might be that school is the one place where we can ensure that students are having real, genuine, human-to-human communication and connection.
So when I hear people talk about students talking to their avatar tutor, I worry: When are we going to actually make sure that they’re building those human skills?
Greg Toppo is a senior writer at The 74.
Trump’s ‘big, beautiful bill’ will leave millions of Latino children without healthcare – EL PAÍS English
Donald Trump’s so-called “big, beautiful bill,” which Congress is about to pass, targets one of the most vulnerable segments of society: children — particularly those from immigrant families, with Latino communities disproportionately affected. More than half of Hispanic children rely on healthcare and food assistance programs that the proposed legislation aims to slash in order to fund tax cuts for the wealthiest Americans.
“Children and families across America are at risk of losing affordable health coverage and access to healthy meals to pay for a massive tax cut for billionaires and big corporations,” said Florida Congresswoman Kathy Castor.
Last Thursday, the House of Representatives approved the spending bill by a single vote (215 to 214), and it now moves to the Senate, where Republicans also hold a majority. If the Senate gives its approval, this bill would lead to the largest cuts to Medicaid — the program providing medical assistance to low-income individuals — in U.S. history. Additionally, the Supplemental Nutrition Assistance Program (SNAP) would suffer its deepest reduction in nearly three decades, as critics have pointed out. These cuts would endanger children, particularly those in Latino and other historically marginalized communities.
This is highlighted in a new report published by UnidosUS, AFL-CIO, and First Focus on Children, which shows that nearly 45% of children in the U.S. (about 34 million) rely on Medicaid and SNAP for essential healthcare and nutrition. Of those, 14 million children rely on both programs, putting them at double the risk of losing access to healthcare and food assistance.
“There is nothing in this big bill that’s beautiful for children, but the gigantic cuts to Medicaid and the Supplemental Nutrition Assistance Program are particularly ugly,” said Bruce Lesley, president of First Focus on Children. “Children already are struggling with rising infant and child mortality, increased poverty, and growing rates of hunger, homelessness and a lack of health insurance. These proposed cuts would dig deeper into this crisis.”
The spending bill includes a $1 trillion cut to social protection programs. The measure has been denounced by experts as racially biased. Fifty-eight percent of Latino children, 67% of Native American children, and 65% of Black children benefit from one of the programs.
“Let’s be clear: this bill is nothing short of an assault on our values as a country, but that’s not the worst. It strips away access to health care, food assistance, and clean energy, just to fund tax breaks for billionaires and criminalize vulnerable communities,” said Katharine Pichardo, president and CEO of Latino Victory.
Without SNAP or Medicaid, more children will face hunger, developmental delays, and untreated medical conditions, including chronic illnesses that could have been prevented with early care. Poverty is tied to race and ethnicity — 29% of Hispanic children in mixed-status families live in poverty. According to the National Immigration Law Center (NILC), children of immigrants experience poverty at twice the rate of those with two citizen parents, and Latino children experience triple the poverty rate of white children.
The bill contradicts Republican promises not to fund tax cuts through reductions to healthcare assistance programs like Medicaid and Medicare. However, it aligns with their push to limit social program use by immigrant families.
Immigrant rights advocates argue that the changes will primarily affect children who are U.S. citizens but have undocumented or temporarily authorized immigrant parents. About 12% of U.S. children — around nine million — have at least one parent who is not a U.S. citizen. The “beautiful” law removes benefits for anyone with a parent who is not a citizen or who lacks permanent residency. This includes refugees, asylum recipients, Temporary Protected Status (TPS) beneficiaries, and DACA recipients who arrived in the U.S. as children.
Currently, only 4% of American children with U.S. citizen parents lack health insurance — half the rate of children whose parents are not citizens (8%).
In addition to losing access to health insurance and food stamps, migrant families without a Social Security number will lose the child tax credit, which the bill sets at $2,500 per child — affecting between two and four million U.S. children.
Republicans defend the proposal as part of their crackdown on undocumented immigrants, whom they accuse of abusing social programs to the detriment of American citizens. But the cuts will mostly impact those who are in the country legally, as unauthorized immigrants do not receive federal benefits. Some states provide healthcare or educational access, but the federal government has already threatened reprisals for doing so. Moreover, many immigrants are afraid to provide personal data or apply for benefits they qualify for, fearing it will make them a target for deportation.

NC DHHS announces new state health director – WCNC


RALEIGH, N.C. — A new state health director has been named for the North Carolina Department of Health and Human Services, someone the department said has pioneered work to improve mental health and addiction services in the Tar Heel State.
The department announced Wednesday that Dr. Lawrence Greenblatt was named to the role as state health director and chief medical officer, and he will start on Monday, June 2. NC DHHS said Greenblatt is a physician, professor, and champion for public health services.
“Dr. Greenblatt is an innovator and public health advocate with a long track record of increasing access to mental and physical health care in North Carolina,” said NC Health and Human Services Secretary Dev Sangvai. “He has the vision and experience needed to lead our state’s public health efforts as we work to create a healthier North Carolina for all.”
The announcement from NC DHHS said Greenblatt served for three decades as a general internist, educator and Medicaid policy leader with the Duke University Health System, earning recognition for his work to integrate behavioral health and addiction services into primary care. He has led Duke’s Medicaid Network since 2008, first at Northern Piedmont Community Care, which was part of the statewide Community Care of North Carolina network. He continued as Medical Director of Duke’s Clinically Integrated Network under Medicaid transformation in 2021. The network serves 100,000 Medicaid enrollees and supports several practices.
“I am honored to be chosen for this important role in improving the health and well-being of the more than 11 million North Carolinians that call this great state home,” said Dr. Greenblatt. “As a physician and educator, I know the value of making sure every person has access to mental and physical health care when they need it and in the setting that is most appropriate for them.”
NC DHHS said that in 2012, Greenblatt launched one of the nation’s first academic initiatives to promote safe opioid prescribing and expand treatment for opioid use disorder. He also served as Chair of the NC Medicaid Pharmacy and Therapeutics Committee and Secretary of the NC Medicaid Physician Advisory Group. He also co-led the Durham Crisis Collaborative and actively contributed to local substance use and mental health planning efforts.


Shettima: Our country can be transformed when children get the power to imagine – Daily Trust


 Vice-President Kashim Shettima has said Nigeria can be transformed when children get the power to imagine.
He said this when the ExxonMobil Foundation, in collaboration with PanAfricare and TechWomen, officially launched the INSPIRE Project to improve science and technology education in Nigeria.
The project was launched at the NAF Conference Centre in Abuja, on Wednesday.
According to the organisers, the INSPIRE Project, short for Innovation, STEM, and Partnerships for Inclusive and Relevant Education, is set to be implemented in 14 underserved government secondary schools across six Nigerian states: Abia, Bauchi, Kaduna, Lagos, Rivers, and the Federal Capital Territory.
The initiative aims to transform STEM (Science, Technology, Engineering, and Mathematics) learning across the country.
Four secondary schools in the FCT — Government Girls Science Secondary School Kuje, Government Secondary School Kuje, Junior Secondary School Zuba, and Junior Secondary School Sangwari Dutse — received the first sets of INSPIRE Boxes during the launch.
Shettima, who was represented by the Special Adviser to President Bola Ahmed Tinubu on special duties, Aliyu Umar Moddino, said the programme would make Nigerian scientists and engineers formidable players on the world stage.
He said, “This programme will take a step in ensuring that Nigerian future scientists, engineers and tech leaders are not just spectators in the global knowledge economy, but formidable players.
“The Inspire Project is a bold and deliberate effort to spark an enduring interest in science and technology, engineering, mathematics among our students.
“The vision aligns with the direction of His Excellency, President Bola Ahmed Tinubu, that investing in education is the most enduring means of nation building, because when we give our children the power to imagine, we give the country the power to transform.”
Speaking at the event, United States Ambassador to Nigeria, Richard M. Mills, Jr., said that over the next 15 years the programme could transform many Nigerian women into global tech leaders.
Mills said, “I am hoping that in the next fifteen years, many of the faces I see here in the crowd will perhaps be part of TechWomen, a programme supported by the United States embassy for emerging women leaders in science, technology and engineering across Africa and other parts of the world. This exchange programme offers mentorship, collaboration and mutual understanding.
“The programme also helps expand people’s professional skills and promote cultural understanding. It inspires new generations of innovators regardless of gender.”