Meta
‘Sorry for everything you’ve all been through,’ Zuckerberg says to parents of child victims
Sexual predators. Addictive features. Suicide and eating disorders. Unrealistic beauty standards. Bullying. These are just some of the issues young people are dealing with on social media — and children's advocates and lawmakers say companies are not doing enough to protect them.
On Wednesday, the CEOs of Meta, TikTok, X and other social media companies went before the Senate Judiciary Committee to testify at a time when lawmakers and parents are growing increasingly concerned about the effects of social media on young people’s lives.
The hearing began with recorded testimony from kids and parents who said they or their children were exploited on social media. Throughout the hourslong event, parents who lost children to suicide silently held up pictures of their dead kids.
"They’re responsible for many of the dangers our children face online,” Senate Majority Whip Dick Durbin, who chairs the committee, said in opening remarks. “Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk.”
In a heated question and answer session with Mark Zuckerberg, Republican Missouri Sen. Josh Hawley asked the Meta CEO if he has personally compensated any of the victims and their families for what they have been through.
“I don't think so,” Zuckerberg replied.
“There's families of victims here,” Hawley said. “Would you like to apologize to them?”
Zuckerberg stood, turned away from his microphone and the senators, and directly addressed the parents in the gallery.
“I’m sorry for everything you have all been through. No one should go through the things that your families have suffered,” he said, adding that Meta continues to invest and work on “industrywide efforts” to protect children.
But time and time again, children’s advocates and parents have stressed that none of the companies are doing enough.
One of the parents who attended the hearing was Neveen Radwan, whose teenage daughter got sucked into a “black hole of dangerous content” on TikTok and Instagram after she started looking at videos on healthy eating and exercise at the onset of the COVID lockdowns. She developed anorexia within a few months and nearly died, Radwan recalled.
“Nothing that was said today was different than what we expected,” Radwan said. “It was a lot of promises and a lot of, quite honestly, a lot of talk without them really saying anything. The apology that he made, while it was appreciated, it was a little bit too little, too late, of course.”
But Radwan, whose daughter is now 19 and in college, said she felt a “significant shift” in the energy as she sat through the hearing, listening to the senators grill the social media CEOs in tense exchanges.
“The energy in the room was, very, very palpable. Just by our presence there, I think it was very noticeable how our presence was affecting the senators,” she said.
Hawley continued to press Zuckerberg, asking if he'd take personal responsibility for the harms his company has caused. Zuckerberg stayed on message and repeated that Meta's job is to “build industry-leading tools” and empower parents.
“To make money,” Hawley cut in.
South Carolina Sen. Lindsey Graham, the top Republican on the Judiciary panel, echoed Durbin's sentiments and said he's prepared to work with Democrats to solve the issue.
“After years of working on this issue with you and others, I’ve come to conclude the following: Social media companies as they’re currently designed and operate are dangerous products," Graham said.
The executives touted existing safety tools on their platforms and the work they’ve done with nonprofits and law enforcement to protect minors.
Snapchat broke ranks ahead of the hearing and is backing a federal bill that would create a legal liability for apps and social platforms that recommend harmful content to minors. Snap CEO Evan Spiegel reiterated the company’s support on Wednesday and asked the industry to back the bill.
TikTok CEO Shou Zi Chew said the company is vigilant about enforcing its policy barring children under 13 from using the app. X CEO Linda Yaccarino said the platform, formerly known as Twitter, doesn’t cater to children.
“We do not have a line of business dedicated to children,” Yaccarino said. She said the company will also support the Stop CSAM Act, a federal bill that would make it easier for victims of child exploitation to sue tech companies.
Yet child health advocates say social media companies have failed repeatedly to protect minors.
Profits should not be the primary concern when companies are faced with safety and privacy decisions, said Zamaan Qureshi, co-chair of Design It For Us, a youth-led coalition advocating for safer social media. “These companies have had opportunities to do this before they failed to do that. So independent regulation needs to step in.”
Republican and Democratic senators came together in a rare show of agreement throughout the hearing, though it’s not yet clear if this will be enough to pass legislation such as the Kids Online Safety Act, proposed in 2022 by Sens. Richard Blumenthal of Connecticut and Marsha Blackburn of Tennessee.
“There is pretty clearly a bipartisan consensus that the status quo isn’t working," said New Mexico Attorney General Raúl Torrez, a Democrat. “When it comes to how these companies have failed to prioritize the safety of children, there’s clearly a sense of frustration on both sides of the aisle.”
Meta is being sued by dozens of states that say it deliberately designs features on Instagram and Facebook that addict children to its platforms. New Mexico filed a separate lawsuit saying the company has failed to protect children from online predators.
New internal emails between Meta executives released by Blumenthal’s office show Nick Clegg, the company's president of global affairs, and others asking Zuckerberg to hire more people to strengthen “wellbeing across the company” as concerns grew about effects on youth mental health.
“From a policy perspective, this work has become increasingly urgent over recent months. Politicians in the U.S., U.K., E.U. and Australia are publicly and privately expressing concerns about the impact of our products on young people’s mental health,” Clegg wrote in an August 2021 email.
The emails released by Blumenthal’s office don’t appear to include a response, if there was any, from Zuckerberg. In September 2021, The Wall Street Journal released the Facebook Files, its report based on internal documents from whistleblower Frances Haugen, who later testified before the Senate. Clegg followed up on the August email in November with a scaled-down proposal but it does not appear that anything was approved.
“I’ve spoken to many of the parents at the hearing. The harm their children experienced, all that loss of innocent life, is eminently preventable. When Mark says ‘Our job is building the best tools we can,’ that is just not true,” said Arturo Béjar, a former engineering director at the social media giant known for his expertise in curbing online harassment who recently testified before Congress about child safety on Meta’s platforms. “They know how much harm teens are experiencing, yet they won’t commit to reducing it, and most importantly to be transparent about it. They have the infrastructure to do it, the research, the people, it is a matter of prioritization.”
Béjar said the emails and Zuckerberg's testimony show that Meta and its CEO “do not care about the harm teens experience” on their platforms.
“Nick Clegg writes about profound gaps with addiction, self-harm, bullying and harassment to Mark. Mark did not respond, and those gaps are unaddressed today. Clegg asked for 84 engineers of 30,000,” Béjar said. “Children are not his priority.”
10 months ago
Facebook parent Meta hit with record fine for transferring European user data to US
The European Union slapped Meta with a record $1.3 billion privacy fine Monday (May 22, 2023) and ordered it to stop transferring user data across the Atlantic by October, the latest salvo in a decadelong case sparked by U.S. cybersnooping fears.
The fine of 1.2 billion euros from Ireland's Data Protection Commission is the biggest since the EU's strict data privacy regime took effect five years ago, surpassing Amazon's 746 million euro penalty in 2021 for data protection violations.
The Irish watchdog is Meta's lead privacy regulator in the 27-nation bloc because the Silicon Valley tech giant's European headquarters is based in Dublin.
Meta, which had previously warned that services for its users in Europe could be cut off, vowed to appeal and ask courts to immediately put the decision on hold.
“There is no immediate disruption to Facebook in Europe,” the company said.
“This decision is flawed, unjustified and sets a dangerous precedent for the countless other companies transferring data between the EU and U.S.,” Nick Clegg, Meta's president of global affairs, and Chief Legal Officer Jennifer Newstead said in a statement.
It's yet another twist in a legal battle that began in 2013 when Austrian lawyer and privacy activist Max Schrems filed a complaint about Facebook’s handling of his data following former National Security Agency contractor Edward Snowden’s revelations about U.S. cybersnooping.
The saga has highlighted the clash between Washington and Brussels over the differences between Europe's strict view on data privacy and the comparatively lax regime in the U.S., which lacks a federal privacy law.
An agreement covering EU-U.S. data transfers known as the Privacy Shield was struck down in 2020 by the EU's top court, which said it didn’t do enough to protect residents from the U.S. government's electronic prying.
That left another tool to govern data transfers — stock legal contracts. Irish regulators initially ruled that Meta didn't need to be fined because it was acting in good faith in using them to move data across the Atlantic. But that ruling was overruled by the EU's top panel of data privacy authorities last month, a decision the Irish watchdog confirmed Monday.
Meanwhile, Brussels and Washington signed an agreement last year on a reworked Privacy Shield that Meta could use, but the pact is awaiting a decision from European officials on whether it adequately protects data privacy.
EU institutions have been reviewing the agreement, and the bloc's lawmakers this month called for improvements, saying the safeguards aren't strong enough.
Meta warned in its latest earnings report that without a legal basis for data transfers, it will be forced to stop offering its products and services in Europe, “which would materially and adversely affect our business, financial condition, and results of operations.”
The social media company might have to carry out a costly and complex revamp of its operations if it's forced to stop shipping user data across the Atlantic. Meta has a fleet of 21 data centers, according to its website, but 17 of them are in the United States. Three others are in the European nations of Denmark, Ireland and Sweden. Another is in Singapore.
Other social media giants are facing pressure over their data practices. TikTok has tried to soothe Western fears about the Chinese-owned short video sharing app's potential cybersecurity risks with a $1.5 billion project to store U.S. user data on Oracle servers.
1 year ago
Encrypted video calls with up to 8, audio calls with up to 32 people on WhatsApp: Zuckerberg
Meta CEO Mark Zuckerberg today (March 23, 2023) announced the launching of a new WhatsApp desktop app for Windows.
“Now you can make E2E encrypted video calls with up to 8 people and audio calls with up to 32 people,” Zuckerberg posted on his verified Facebook profile.
In a recent blog post, Meta said: “We’ll continue to increase these limits over time so you can always stay connected with friends, family, and coworkers.”
The tech giant also announced that its instant messaging app for Windows has received a revamped look along with new features.
“The new WhatsApp app for Windows will load faster and is built with an interface similar to the mobile version of the app,” the blog post reads.
“We’ve made improvements to device linking and better syncing across multiple devices,” Meta said.
To access the new features, users have to install the latest update of the WhatsApp desktop app for Windows. Once updated, a call option will appear in the chat box, similar to the call icon available in the WhatsApp app on Android or iOS.
Meta also announced the new Mac desktop version of the app, which is currently in beta testing.
With the privacy of users in mind, WhatsApp has also rolled out a new feature that gives group admins more control over their group privacy.
“As more people join communities, we want to give group admins more control over their group privacy, so we’ve built a simple tool that gives admins the ability to decide who is able to join a group,” Meta said in another blog post.
1 year ago
Meta slashes another 10,000 jobs
Facebook parent Meta is slashing another 10,000 jobs and will not fill 5,000 open positions as the social media pioneer cuts costs.
The company announced 11,000 job cuts in November, about 13% of its workforce at the time.
Meta and other tech companies hired aggressively for at least two years but in recent months have begun to let some of those workers go.
Early last month, Meta posted falling profits and its third consecutive quarter of declining revenue.
The company said Tuesday it will reduce the size of its recruiting team and make further cuts in its tech groups in late April, and then its business groups in late May.
“This will be tough and there’s no way around that,” said CEO Mark Zuckerberg. “It will mean saying goodbye to talented and passionate colleagues who have been part of our success.”
The Menlo Park, California, company has invested billions of dollars to realign its focus on the metaverse. In February it said a downturn in online advertising and competition from rivals such as TikTok weighed on results.
“As I’ve talked about efficiency this year, I’ve said that part of our work will involve removing jobs -- and that will be in service of both building a leaner, more technical company and improving our business performance to enable our long term vision,” said Zuckerberg.
The biggest tech companies in the U.S. are cutting costs elsewhere, too.
This month, Amazon paused construction on its second headquarters in Virginia following the biggest round of layoffs in the company’s history and its shifting plans around remote work.
In early trading, Meta shares rose 6%.
1 year ago
Facebook user data issue: Facebook parent company Meta will pay $725M
Facebook’s corporate parent has agreed to pay $725 million to settle a lawsuit alleging the world’s largest social media platform allowed millions of its users’ personal information to be fed to Cambridge Analytica, a firm that supported Donald Trump’s victorious presidential campaign in 2016.
Terms of the settlement reached by Meta Platforms, the holding company for Facebook and Instagram, were disclosed in court documents filed late Thursday. It will still need to be approved by a judge in a San Francisco federal court hearing set for March.
The case sprang from 2018 revelations that Cambridge Analytica, a firm with ties to Trump political strategist Steve Bannon, had paid a Facebook app developer for access to the personal information of about 87 million users of the platform. That data was then used to target U.S. voters during the 2016 campaign that culminated in Trump’s election as the 45th president.
Uproar over the revelations led to a contrite Zuckerberg being grilled by U.S. lawmakers during a high-profile congressional hearing and spurred calls for people to delete their Facebook accounts. Even though Facebook’s growth has stalled as more people connect and entertain themselves on rival services such as TikTok, the social network still boasts about 2 billion users worldwide, including nearly 200 million in the U.S. and Canada.
The lawsuit, which had been seeking to be certified as a class action representing Facebook users, had asserted the privacy breach proved Facebook is a “data broker and surveillance firm,” as well as a social network.
The two sides reached a temporary settlement agreement in August, just a few weeks before a Sept. 20 deadline for Meta CEO Mark Zuckerberg and his long-time chief operating officer, Sheryl Sandberg, to submit to depositions.
The company, based in Menlo Park, California, said in a statement Friday that it pursued a settlement because it was in the best interest of its community and shareholders.
“Over the last three years we revamped our approach to privacy and implemented a comprehensive privacy program," said spokesperson Dina El-Kassaby Luce. “We look forward to continuing to build services people love and trust with privacy at the forefront.”
1 year ago
Meta oversight board urges changes to VIP moderation system
Facebook parent Meta’s quasi-independent oversight board said Tuesday that an internal system that exempted high-profile users, including former U.S. President Donald Trump, from some or all of its content moderation rules needs a major overhaul.
The report by the Oversight Board, which was more than a year in the making, said the system “is flawed in key areas which the company must address.”
Meta asked the board to look into the system after The Wall Street Journal reported last year that it was being abused by many of its elite users, who were posting material that would result in penalties for ordinary people, including for harassment and incitement of violence.
According to the Journal article, Facebook’s rules didn’t seem to apply to some VIP users at all, while others were allowed to post rule-breaking material pending reviews that never happened; the system had at least 5.8 million exempted users as of 2020.
The system — known as “XCheck,” or cross-check — was exposed in Facebook documents leaked by Frances Haugen, a former product manager turned whistleblower who captured worldwide headlines with revelations alleging that the social media company prioritized profits over online safety and galvanized regulators into cracking down on hate speech and misinformation.
Nick Clegg, Meta’s president for global affairs, tweeted that the company requested the review of the system “so that we can continue our work to improve the program.”
To fully address the board’s recommendations, “we’ve agreed to respond within 90 days,” he added.
The company has said cross-check, which applies to Facebook and Instagram, was designed to prevent “overpolicing,” or mistakenly removing content thought to be breaking the platform’s rules.
The Oversight Board’s report said that the cross-check system resulted in users being treated unequally and that it led to delays in taking down content that violated the rules because there were up to five separate checks. Decisions on average took more than five days, it found.
For content posted by American users, the average decision took 12 days, and for Afghanistan and Syria, it was 17 days. In some cases, it took a lot longer: one piece of content waited 222 days — more than seven months — for a decision, the report said, without providing further details.
Among its 32 recommendations, the board said Meta “should prioritize expression that is important for human rights, including expression which is of special public importance.”
The report said human rights defenders, advocates for marginalized communities, public officials and journalists should be given higher priority than those added to the cross-check list because they are business partners, a group that includes big companies, political parties, musicians, celebrities and artists.
“If users included due to their commercial importance frequently post violating content, they should no longer benefit from special protection,” the board said.
Addressing other flaws, the board also urged Meta to remove or hide content while it’s being reviewed and said the company should “radically increase transparency around cross-check and how it operates,” such as outlining “clear, public criteria” on who gets to be on the list.
The board upheld Facebook’s decision to ban Trump last year out of concern he incited violence leading to the riot on the U.S. Capitol. But it said the company failed to mention the cross-check system in its request for a ruling. The company has until Jan. 7 to decide whether to let Trump back on.
Clegg said in a blog post that Meta has already been making changes to cross-check, including standardizing it so that it’s “run in a more consistent way,” opening up the system to content from all 3 billion Facebook users and holding annual reviews to verify its list of elite users and entities.
After widespread criticism that it failed to respond swiftly and effectively to misinformation, hate speech and harmful influence campaigns, Facebook set up the oversight panel as the ultimate referee of thorny content issues it faces. Members include a former Danish prime minister, the former editor-in-chief of British newspaper the Guardian, as well as legal scholars and human rights experts.
Some critics have previously questioned the board’s independence and said its narrow content decisions seemed to distract from wider problems within Facebook and concerns about government regulation.
1 year ago
Meta contributes over Tk1.5 crore for Sitrang-hit people's rehabilitation efforts
Meta has committed more than Tk1.5 crore to Bangladesh Red Crescent Society and BRAC to help them with rehabilitation efforts for the communities impacted by cyclone Sitrang.
This commitment also includes funding to support disaster preparedness activities through Bangladesh Environment and Development Society in partnership with GlobalGiving.
Sitrang has damaged at least 10,000 homes in Bangladesh, and about 1,000 shrimp farms were washed away in floods.
These contributions, which are being made directly and from Meta's Crisis and Disaster Response Fund, will also go towards relief and disaster preparedness initiatives of the three Bangladesh-based non-profit organisations, the company said in a statement.
"Our thoughts are with the people of Bangladesh who are recovering from the devastating effects of cyclone Sitrang. We hope that our contributions will support local organisations that are working hard to rehabilitate people who were impacted," Jordi Fornies, Meta's director for emerging markets in Asia Pacific, said.
People often rely on online platforms to stay connected during and after natural disasters. After the cyclone hit Bangladesh, Meta created a Crisis Response page on Facebook where people can use the Community Help feature to request help or offer support, such as food, shelter, and emergency evacuations. The platform also activated Safety Check, which enabled people to let their friends and family know that they are safe.
"Alongside connecting people, Meta’s platform Facebook also helped us to get real-time updates and information from the field. This hands-on information also enabled us to take effective decisions to support people in need in a timely manner. In addition, the financial assistance from Meta also contributed to our ongoing response and recovery effort for cyclone Sitrang and helped us to reach out to more people,” Kazi Shofiqul Azam, secretary general of Bangladesh Red Crescent Society, said.
Meta is also giving $100,000 in ad credits to local organisations to support their campaigns related to crisis response. This will also help their future preparedness plans for climate-induced disasters in Bangladesh.
"Communities living in the coastal areas of Bangladesh are at the forefront of devastating impacts of cyclones. Every year, the country is facing a new level of damage due to cyclonic storms. As such, cyclone Sitrang also left a trail of destruction in several coastal districts, including damage to shelters, croplands, and fish enclosures," Md Liakath Ali, director of the Disaster Risk Management Programme of BRAC, said.
"BRAC acknowledges that disaster risk reduction requires support and humanitarian assistance to recover the losses. We thank Meta for its generous support in the wake of natural and human-caused crises, so that affected people will be able to meet their urgent necessities."
According to a recent World Bank Group report, tropical cyclones cost Bangladesh about $1 billion a year on average. The research suggests that over 13 million Bangladeshis may become internal migrants in the next 30 years due to climate impacts.
2 years ago
Meta brings Facebook Reels to Bangladesh
After the launch of Instagram Reels in August, Meta has now brought Facebook Reels to everyone in Bangladesh, introducing short-form, entertaining video experiences and tools to creators and audiences on the Facebook app.
It is available for both iOS and Android users, Meta said in a statement Sunday.
"Reels gives people a new outlet to express their creativity with the ability to record videos, select music, and add photos and timed text. It helps creators expand the reach of their content, and for new creators to be discovered," it added.
Meta has expanded the availability of Facebook Reels for iOS and Android to more than 150 countries across the globe.
"We're also introducing better ways to help creators to earn money, new creation tools and more places to watch and create Facebook Reels," the company said.
Meta is focusing on making Reels the best way for creators to get discovered, connect with their audience and earn money. The company also wants to make it fun and easy for people to find and share relevant and entertaining content.
Since Facebook Reels launched in the US, it has seen creators like Kurt Tocci (and his cat, Zeus) share original comedic skits, author and Bulletin writer Andrea Gibson offer a reading of their published poetry, Nigerian-American couple Ling and Lamb try new foods, and dancer and creator Niana Guerrero do trending dances, like the #ZooChallenge.
Bangladeshi content creators like Raba Khan, Ridy Sheikh, and Petuk Couple also shared their reels under the #ReelDeshi hashtag and invited others to create their own videos to show what makes them real "deshis." These one-minute travel stories, dance challenges and recipes show glimpses of the historic sites, music and food of Bangladesh.
"At Meta, we are always testing new ways for people to express themselves and entertain others. Reels have been inspiring Bangladeshi creators on Instagram, and now people on Facebook can discover more entertaining content, and creators can reach new audiences," Jordi Fornies, Meta's director of emerging markets in Asia Pacific, said.
"We are excited to see the creativity and connections that Reels will unlock for the Bangladeshi Facebook community."
According to Meta, video makes up more than 50 percent of the time spent on Facebook. "There is growing interest in watching fun, entertaining short-form videos, and expressing themselves by making their own."
2 years ago
‘Kothon Caravan’ seeks applications from Bangladesh
GenLab, a Bangladeshi youth-led social enterprise, is curating a storytelling training programme and festival named 'Kothon Caravan' with support from Meta (Facebook's parent company) and in association with the UK-based storytelling organization Caravanserai Collective (CERITA).
Under the same partnership model, the programme is being implemented in a select few countries across Africa and South and Central Asia.
Applications from Bangladesh close on 12th December (to apply: https://linktr.ee/genlab_forms).
The programme offers Bangladeshi citizens aged 18 and above an opportunity to build their storytelling capabilities and drive them toward creating a diverse, tolerant and inclusive Bangladesh.
Women, representatives of marginalized communities, people with disabilities and members of ethnic groups are highly encouraged to apply.
The programme offers hybrid training conducted by GenLab with support from Meta and CERITA. The training modules focus on the skill of storytelling, the use of storytelling, and the role of social media in pursuing a society of solidarity, diversity and coexistence.
Trained Caravanners will curate micro-project ideas that align with the training perspectives, said a media release on Friday.
The facilitators will assist in curating sustainable and impactful ideas.
These micro-projects will include vibrant and impactful social media activities, carried out on Facebook and Instagram throughout the course of the programme.
The programme concludes with the ‘Kothon Caravan’ Fest, scheduled for later in December, which will be open to all and will feature the caravan’s micro-projects, followed by discussions and cultural events.
2 years ago
Is your Instagram crashing?
Users of the online photo-sharing and social networking service Instagram are reporting issues with the app.
Users of the popular social media app have shared that it keeps crashing or closing abruptly.
Meta-owned Instagram lets users take pictures, apply filters to them and share those pictures in several ways, including through social networks such as Twitter and Facebook. It is available as an application for iPhone, iPad, and Android devices.
2 years ago