Demand for firearms and ammunition is reaching record highs, but some retailers are squeezed hard by supply chain bottlenecks

In 2020, when social instability was sparked by political tensions and the ongoing COVID-19 outbreak, Coloradans purchased firearms in historic numbers.

According to Colorado Bureau of Investigation transaction figures, the trend continued into 2021 with just a tiny decrease in momentum.

The CBI recorded 487,097 firearm purchases in 2020, up from the 335,370 sales recorded in 2019.

Although the agency has not yet published statistics for December 2021, 401,903 weapon sales had been recorded through November.

While some La Plata County gun businesses prospered in 2021, others battled to maintain adequate stock levels due to supply chain concerns that impacted certain stores more than others.

Agnus Geiger, the founder of Gunfighter Organization LLC, oversees four businesses.

A full-service retail establishment, Colorado Gunfighter, sells weapons, ammo, and clothing.

Then there’s Gunfighter Development Group, which provides professional instruction; Tac-Con, which develops assisted reset rifle triggers; and Gunfighter Cartridge Co., which manufactures ammunition and componentry.

Geiger relocated to a more extensive facility in Durango approximately 18 months ago, increasing his store floor area. The gun store is doing so well that he is considering expanding into nine other states.

Geiger, a former United States Army staff sergeant, ascribed the increase in gun purchases to political turmoil and fear of violence.

“It’s unfortunate that violence fuels the weapons market, but it also fuels our training firm,” he said. “As a result, an increasing number of individuals are being trained.

“I watch all the riots and all the other things that occurred last year when they were burning down cities, and people are purchasing more and more firearms,” he said.

Geiger claimed his firearm-training simulator is “booking up like crazy despite the website’s lack of advertising.”

“It is well within people’s rights to defend themselves, which I applaud, and they are equipping themselves appropriately,” he added. “On the plus side, they are also educating themselves.”

Geiger is not alone in his frustration with supply chain challenges in the firearms and ammunition industry.

Rocky Mountain Pawn and Gun owner Bruce Dominey claimed supply chain issues and the COVID-19 outbreak are pushing him out of business.

“It’s destroying us,” he said. “After 30 years, we’re closing our doors.”

Dominey said that his pawn store sold a substantial number of firearms in 2021, at least until he ran out of weapons to sell.

“The supply channels are all messed up, and as a result, we can’t acquire any,” he said. “We are unable to get goods. I am unable to order firearms and ammunition to replenish my inventory.”

A staffer at Durango’s Goods for the Woods described gun supply as “poor,” though rifles and ammunition were “coming in” again this January.

Rocky Mountain Pawn and Gun’s lending business also suffered a setback. Dominey said that before COVID-19 lockdowns, loans would be made and then returned with interest at the end of the month.

However, when more people stayed at home, particularly once government checks were sent to residents while businesses were closed, they cut down on spending, he said.

After 30 years in Durango, Dominey said he is open to new opportunities and plans to go to Texas.

“We’re going to shut her down,” he said.

According to Powder Valley Inc., a distributor of gun powder, primers, and accessories, only four firms (Winchester, Remington, Federal, and CCI) make primers for civilian, law enforcement, and military use in the United States.

“People being ill, skipping work to care for their children, and self-quarantining, from factory employees to delivery drivers and across the supply chain, contributed to a pause in production this spring,” Powder Valley said in October on its corporate blog.

Geiger uses a series of Mark 7 ammunition reloading machines that feed finished rounds into storage bins. However, the Mark 7s are gathering dust due to supply chain bottlenecks.

“These are Mark 7 machines; they can load 3,500 rounds per hour,” Geiger said. “When they’re running, it’s incredible because they’re earning money, but they’re simply sitting here now.”

Engineering students explore snowplow data to assess Toronto’s response to winter storms

Last January, when 55 centimeters of snow blanketed Toronto in just 15 hours, the city’s snow removal fleet seemed to be struggling to keep up. But was it really different from other storms, or did it just seem so?

For three students from the University of Toronto’s Faculty of Applied Science and Engineering who were taking “Data Science for Engineers,” a graduate course taught by Sebastien Goodfellow, assistant professor in the department of civil and mineral engineering, it was the perfect case study for testing their new data science skills.

“There was a lot of media coverage at the time saying the city had responded poorly,” says Katia Ossetchkina, a master’s candidate. “We wanted to see if there was a way to analyze the movement and distribution of snowplows and salt trucks across the city.”

Real-time data on the locations of Toronto’s more than 800 snowplows and salt trucks is publicly available during the winter months. There’s even a website that tracks this data on a map. But the team – which also included master’s candidates Thomas de Boer and Lucas Herzog – soon realized they needed more.

“There is no historical storage,” de Boer explains. “You can’t just download it as a file, so we had to create an algorithm that would ping that web server, download the data, and store it on our computer, which we could then use to create our own historical database and do our analysis from that.”
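A minimal sketch of such a collector, assuming a hypothetical JSON endpoint (the team’s actual code and the real PlowTO URL and schema are not shown here), would poll the server on an interval and append each raw snapshot to a local database for later analysis:

```python
import json
import sqlite3
import time

import requests

# Hypothetical endpoint; the actual PlowTO data URL and schema differ.
FEED_URL = "https://example.toronto.ca/plow-locations.json"

conn = sqlite3.connect("plow_history.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS snapshots (fetched_at REAL, payload TEXT)"
)

while True:
    try:
        resp = requests.get(FEED_URL, timeout=30)
        resp.raise_for_status()
        # Store the raw payload with a timestamp; parsing can happen later.
        conn.execute(
            "INSERT INTO snapshots VALUES (?, ?)",
            (time.time(), json.dumps(resp.json())),
        )
        conn.commit()
    except requests.RequestException as exc:
        print(f"fetch failed, will retry: {exc}")
    time.sleep(60)  # poll once a minute
```

Keeping the raw payload rather than a parsed subset means the historical database preserves everything the feed exposed, even fields the team did not yet know they would need.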

By the time the team set up their technique, it was too late to collect data on the January storm. But by analyzing data from subsequent storms, and by gleaning statistics about earlier ones from city figures and local news reports, the team was able to verify that the city’s response improved as winter progressed.

“We learned that Toronto had increased the number of snow plows on the road in February compared to January, and crews were faster in meeting certain benchmarks, such as the percentage of roads that had been cleared of snow at some point during the storm,” says de Boer.

PlowTO shows live snowplow locations around town during the winter months.

Herzog says the team also spotted other interesting trends.

“Of course they clear the arteries first, but we saw that they would stop clearing around 6 a.m., just before the morning commute,” Herzog says.

“And that’s where a lot of these Twitter complaints are coming from,” de Boer adds. “People were wondering how they’re supposed to get to a thoroughfare when the street in front of their driveway is blocked by two feet of snow.”

Stimulated by these kinds of observations, the team decided to take the project a step further by applying their data analysis to Twitter messages. The team used Twitter’s Application Programming Interface (API) to gather feedback from those tweeting at Toronto’s 311 and the City of Toronto’s Winter Operations account. They were then able to carry out what is called a “sentiment analysis”, measuring whether the words used in the tweets were positive or negative.
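The team’s code isn’t reproduced here, but the general recipe might look like the following sketch: pull matching tweets through the Twitter v2 search endpoint and score each one with an off-the-shelf lexicon model such as NLTK’s VADER. The bearer token, handles and query terms below are stand-ins, not the team’s actual parameters:

```python
import tweepy
from nltk.sentiment import SentimentIntensityAnalyzer
# Requires a one-time: nltk.download("vader_lexicon")

# Stand-in credentials and query; the real handles and terms would differ.
client = tweepy.Client(bearer_token="YOUR_BEARER_TOKEN")
tweets = client.search_recent_tweets(
    query="(to:311Toronto OR to:CityOfToronto) snow -is:retweet",
    max_results=100,
)

sia = SentimentIntensityAnalyzer()
for tweet in tweets.data or []:
    # The compound score ranges from -1 (most negative) to +1 (most positive).
    score = sia.polarity_scores(tweet.text)["compound"]
    label = "positive" if score > 0.05 else "negative" if score < -0.05 else "neutral"
    print(f"{label:8} {score:+.2f}  {tweet.text[:60]!r}")
```

Aggregating those compound scores per storm, or per neighbourhood when tweets carry location information, yields exactly the kind of before-and-after comparison described below.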

This allowed the team to compare the public’s response to the January storm to one that occurred in February.

“We saw a lot of negative tweets in January with people complaining about not having service yet, and that also came with a lot of geographic information, so we could see the hardest hit areas,” Ossetchkina says.

“Then we saw this reverse trend in February where people were saying ‘thank you’ and saying the city was doing a good job in specific regions. That was a very interesting performance metric.”

The team says this kind of data analysis could help other engineers on future projects. They made their historical database publicly available and even wrote detailed instructions so other teams could replicate their approach.

Goodfellow says he was very impressed with the students’ work.

“What I love about this project is that it’s very unique,” he says. “This is a new dataset that the students made public and can now be used by other engineers to investigate new questions or to hone their data science skills.

“Even better than that, it was a dataset of the city we all live in that provided a special motivation for students to really push beyond.”

Flosum hosts 2022 Virtual Security Summit on May 12 to discuss Salesforce ecosystem

Sessions include presentations by Taher Elgamal at Salesforce, Brian Boardman at IBM, Rachel Beard at Salesforce and the Flosum team

SAN RAMON, Calif., May 5, 2022 /PRNewswire/ — Flosum, a leading provider of secure end-to-end DevSecOps, data management, data protection and security automation platforms built on Salesforce, today announced its annual Flosum Security Summit, which will take place virtually on Thursday, May 12, from 1-4 p.m. ET.



Flosum Security Summit 2022 will feature industry leaders who will showcase the best methods, tools and strategies to help organizations stay compliant, meet regulatory laws and ensure data security.

“Staying secure and protecting your data as an organization is a shared responsibility among partners, customers, and vendors in the Salesforce ecosystem,” said Girish Jashnani, CEO of Flosum. “Our Security Summit will highlight the challenges and solutions for maximizing security within Salesforce.”

Flosum Security Summit 2022 offers an impressive line-up of speakers and topics, including the following:

  • Building a culture of cybersecurity – Taher Elgamal, CTO, Security at Salesforce

  • Co-create value-driven customer and employee experiences with confidence – Brian Boardman, practice area leader at IBM Federal

  • The State of Salesforce Security in 2022 – Rachel Beard, Distinguished Security Technical Architect at Salesforce

  • Zero Trust Security for Salesforce – Matt St. Onge, Director of Sales Engineering at Flosum

  • Best practices for securing your Salesforce instance – Austin Mehall, Director of Salesforce Development at Flosum

The event will also include an introduction to Flosum Trust Center, a fully integrated security solution to monitor, alert and analyze any potential threat in a Salesforce environment.

Register here for the Flosum Security Summit 2022. A recording will be available for those who register but cannot attend live. To view the full agenda and find out more about the event, visit: https://discover.flosum.com/2022-salesforce-security-summit.

About Flosum

Flosum is a leading provider of end-to-end secure DevSecOps, data management, data protection and security automation platforms built 100% natively for Salesforce. Our mission is to empower IT managers to manage the cloud with confidence and empower developers to innovate using Flosum version management, Salesforce data backup and recovery, and Salesforce security solutions. Companies around the world use Flosum to accelerate digital transformation by making the software release process quick and easy and increasing developer productivity while staying secure and compliant. For more information, visit www.flosum.com.

Salesforce and others are registered trademarks of Salesforce.com, Inc.


View original content to download multimedia: https://www.prnewswire.com/news-releases/flosum-hosts-virtual-security-summit-2022-on-may-12th-to-discuss-salesforce-ecosystem-301541116.html

SOURCE Flosum

Conjura raises €15m Series A to launch the future of e-commerce with AI-powered analytics

DUBLIN–(BUSINESS WIRE)–Conjura, the leading e-commerce data analytics platform, today announced it has raised €15 million in Series A funding. The round was co-led by Act Venture Capital and MiddleGame Ventures, with participation from Tribal VC. Conjura will use the capital to enhance its e-commerce solutions to meet the industry’s ever-growing digital needs in a multi-product solution. The company also announced plans to grow across the UK and Ireland while expanding into multiple international markets simultaneously.

Conjura is pioneering the next generation of e-commerce data analytics by empowering businesses of all sizes with enhanced visibility into all of their operations on a single cloud-based platform, as opposed to multiple, disconnected tools. Traditionally, e-commerce data analysis has been confined to a single operational area, such as marketing or customer retention. Conjura’s real-time analytics overcomes this limitation by combining data from fulfillment, warehousing and supply chain sources with online/offline sales, market transactions and customer metrics. Conjura’s integrated approach to data analytics delivers automated insights and recommendations on all aspects of e-commerce to accelerate overall business performance.

“Last year, consumers spent more than $175 billion online with UK merchants, and these sales are expected to increase by more than 10% in 2022, representing a market value of at least $200 billion. The relentless growth of e-commerce combined with rising customer expectations means the need for businesses to maximize performance and differentiate themselves from their competitors has never been greater,” said Fran Quilty, CEO and co-founder of Conjura. “Our unique approach to data analysis and our ability to provide complete visibility into a company’s performance in an easy-to-use platform represent the next generation of e-commerce solutions. We provide the tools businesses need to improve operational efficiency, as well as ensure they deliver the seamless digital experience that today’s consumers expect.”

“E-commerce businesses need data platforms that address all aspects of their business, and Conjura uniquely delivers this capability,” said Debbie Rennick, general partner at Act Venture Capital. “It is a data-driven decision-making platform designed for next-generation e-commerce. Their strong growth to date speaks to the market opportunity and real need for Conjura’s approach. The investment will accelerate this growth and we are excited to support the team and the opportunity ahead.”

Patrick Pinschmidt, General Partner of MiddleGame Ventures, said, “Conjura is redefining e-commerce analytics by solving real industry problems, paving the way for a host of innovative solutions that drive more effective performance through actionable insights and additional services integrated on top of its advanced technology.” He added, “By offering an integrated analytics solution, Conjura is well positioned to add additional solutions that help businesses improve customer engagement, back-office operations and financial management. We are thrilled to partner with the exceptional team at Conjura to realize the great promise of their platform for the e-commerce industry.”

Conjura’s platform combines benchmarking, business reporting, and data science to propel e-commerce businesses forward. Businesses can benchmark their performance against industry standards, receive in-depth reports that highlight their strengths, weaknesses and growth opportunities, and make smart decisions across all departments based on a 360-degree view of their operations. Conjura seamlessly integrates with all major e-commerce tools and systems, providing a scalable, fully integrated data infrastructure for businesses.

Conjura is fast becoming the performance analytics platform of choice, working with fast-growing brands such as CurrentBody, Naturecan, Wild, Carbon Theory, Saint+Sofia, Percival and CleanCo across multiple e-commerce industries.

About Conjura

Conjura is pioneering the next generation of e-commerce data analytics, harnessing advanced AI to empower businesses of all sizes with increased visibility into all of their business operations. The Conjura platform helps companies integrate, benchmark, and leverage their data to unlock growth opportunities.

Conjura was founded in Ireland in 2018 and supports leading high-growth e-commerce brands across many industries. For more information visit: https://conjura.com/

About Act Venture Capital

Act Venture Capital is an early-stage venture capital firm that partners with fledgling founders and established teams scaling their growth. We have raised approximately 600 million euros across several funds to support this objective. We look for innovative, category-defining companies in major markets. We help founders build and scale these businesses by drawing on our experience supporting over 200 founders. https://actventure.capital/

About MiddleGame Ventures

MiddleGame Ventures (MGV) is a fintech venture capital firm focused on supporting world-class teams building transformative businesses that will reshape financial services for decades to come. MGV is headquartered in Luxembourg with offices in Dublin, London and Washington DC. For more information visit www.middlegamevc.com

DiNapoli Op-Ed in Crain’s New York Business

Crain’s New York Business today published an op-ed by New York State Comptroller Thomas P. DiNapoli on the need for greater transparency and accountability in budgeting for the use of federal relief funds. The full article is below:

What are New Yorkers getting from federal Covid relief funds?

The new state budget reflects robust tax revenue growth and an unprecedented injection of federal funds.

Federal assistance has stabilized state finances, allowing New York to weather the impacts of the Covid-19 pandemic and preserve key services while providing the opportunity to emerge stronger than before. A recent report from my office found that all states received more money than they sent to Washington, D.C., in taxes in federal fiscal year 2020. This was true even for New York, which was among the biggest net losers in tax dollars.

But the extraordinary federal assistance is temporary and should be targeted at programs that ease the pains of the pandemic and improve the lives of all New Yorkers. Developing a clear plan and ensuring funds are used efficiently, effectively and equitably should be top priorities. Unfortunately, the new budget fails to provide the transparency and accountability essential to these efforts.

The state continues to rely on large block appropriations for certain federal funds, which gives the Division of the Budget control over the timing and targets of billions of dollars. Detailed appropriations would allow for greater visibility into the intended use of funds, more accurate reporting, and assurance that the state has a well-thought-out plan for the money.

Federal guidelines for fiscal stimulus funds from the American Rescue Plan — totaling $12.7 billion for New York State — require reporting on plans to address inequality, including how projects will benefit communities disproportionately affected by the pandemic, as well as details on spending, performance and arrangements to ensure results are achieved.

Wanted: Details

The original New York plan affirmed broad principles, but it did not contain specific dollar amounts or detailed information on how the goals would be achieved. Other states are doing a better job, enacting stand-alone apportionment bills, promoting public engagement and releasing comprehensive plans for federal aid.

A state-by-state comparison of ARP fiscal stimulus fund allocations by the National Conference of State Legislatures shows expenditure descriptions for eligible categories, including broadband access, economic aid and development, education, housing, water infrastructure and workforce development. The New York allocation shows only one category: state operations and administration.

No other major state has disclosed less about its use of the funds. New York has not demonstrated how it will deploy funds to improve equity, measure the success of its efforts, or ensure funding reaches the individuals, families and businesses most affected by the pandemic. We must do better.

I’ve published reports on major relief initiatives, including the federal Paycheck Protection Program for small businesses and the Emergency Rental Assistance Program. I also developed a Covid-19 relief fund tracker to provide monthly updates on major initiatives. However, my office’s ability to shed light on the use of funds is limited. The money has gone out slowly for some programs and has been hard to track for others. Information on results is minimal and performance evaluation is difficult.

Perhaps the silver lining of the pandemic is that we have an opportunity to lay the foundations for a better economic future and fairer outcomes. We must intensify our efforts to achieve this. I am committed to doing my part to keep the public and stakeholders informed of our progress.



Changes in Neural Processing and Rating of Negative Facial Expressions After Administration of Open-Label Placebo

Participants

A total of 109 right-handed women were recruited from a non-clinical sample at a university and via social media. The participants were university students (77%) or white-collar workers (13%). Inclusion criteria were female gender and age ≥ 18 years. Exclusion criteria were self-reported diagnoses of mental illness, neurological disorders, and use of psychotropic medication. Data from six participants were removed from the analysis due to excessive EEG artifacts (i.e., less than 70% artifact-free trials). The final sample included 103 women (M_age = 22.48 years, SD = 3.35; all Caucasian, right-handed). Participants were randomly assigned to the OLP group (M_age = 22.46 years, SD = 3.01) or the control group (CG; M_age = 22.49 years, SD = 3.70). The sample was restricted to females due to gender effects in emotional processing and placebo responsiveness [32].

The experiment complied with all relevant ethical guidelines and regulations involving human participants and was approved by the ethics committee of the University of Graz (Austria; GZ. 39/98/63 ex 2020/21). All participants gave informed consent before participating. Eligible participants then came to the laboratory for the experiment.

Experimental design and procedure

The study was pre-registered on the Open Science Framework (https://osf.io/cxfgb/?view_only=5f4c6d564) on April 30, 2021, and conducted between May 25 and August 30, 2021 at the University of Graz (Austria). Additionally, the study was registered as a clinical trial in the German Clinical Trials Register (DRKS00028129; February 18, 2022).

Participants were invited to an EEG study on affective processing (no information about placebos was provided in the invitation). In the preparation room, a researcher used a random number table to assign participants to the OLP or control group. Both groups viewed a presentation (15 PowerPoint slides with figures and text; no audio; fixed duration per slide: 30 s; see supplementary material). Those in the OLP group received information about the neurobiological effects of placebos with an emphasis on affective processing (results from EEG and fMRI studies). The control group watched a presentation on affective neuroscience (results of EEG and fMRI studies). The two presentations were comparable in number of slides, figures, and words. Next, participants rated the presentation (0 = not interesting; 10 = very interesting) and their current affective state (0 = not positive; 10 = very positive).

Afterwards, a short video (duration: two minutes; female presenter) introduced the nasal spray. It was clearly stated that the nasal spray did not contain any ingredients other than saline and water. The OLP group was told that the nasal spray could help reduce emotional reactions when viewing images with angry facial expressions, while the CG was told that the spray would improve the EEG recording. The information was summarized by the experimenter, who helped deliver the nasal spray once into each nostril. Participants in the OLP group rated their expectations regarding the effectiveness of the OLP (“What do you think? How effective will the OLP be?” 0 = not effective; 10 = very effective).

Then the participants were brought to the EEG laboratory. Two experimenters, who were unaware of the group assignment, conducted the EEG experiment. At the end of the EEG experiment, participants were asked to rate the effectiveness of the nasal spray (“How effective do you think the nasal spray was?” 0 = not effective; 10 = very effective).

Image viewing task

Participants viewed a total of 60 images from the Karolinska Directed Emotional Faces database [33]. Thirty images depicted angry facial expressions and 30 depicted neutral facial expressions; 50% of the faces were male and 50% female. The images were presented in random order. In each trial, participants first viewed a blank screen (500 ms), then a fixation cross (500 ms), and then an image with a facial expression (6000 ms). Eight images (four angry/four neutral expressions; 50% male/female) were rated for valence, arousal, and perceived intensity of anger (0-100; 0 = not pleasant, calm, no anger perceived; 100 = very pleasant, aroused, intense anger perceived). The images for the ratings had been randomly selected before the experiment; ratings were collected at random times during the experiment.
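The presentation software is not named here, so purely as an illustrative sketch, the trial structure (blank 500 ms, fixation 500 ms, face 6000 ms, random order) could be scripted in a toolkit such as PsychoPy; image paths below are placeholders:

```python
import random

from psychopy import core, visual

win = visual.Window(fullscr=True, color="grey", units="pix")
fixation = visual.TextStim(win, text="+", height=40)

# Placeholder file names for the 30 angry and 30 neutral face images.
image_files = [f"faces/angry_{i:02}.jpg" for i in range(30)] + \
              [f"faces/neutral_{i:02}.jpg" for i in range(30)]
random.shuffle(image_files)  # images presented in random order

for path in image_files:
    win.flip()                                  # blank screen
    core.wait(0.5)                              # 500 ms
    fixation.draw()
    win.flip()                                  # fixation cross
    core.wait(0.5)                              # 500 ms
    visual.ImageStim(win, image=path).draw()
    win.flip()                                  # face image
    core.wait(6.0)                              # 6000 ms

win.close()
```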

Electrophysiological recording and data analyzes

Continuous EEG activity was recorded using the actiCHamp system (actiCHamp, Brain Products GmbH, Gilching, Germany) with 63 active actiCAP electrodes (according to the 10–10 system) and the BrainVision Recorder (version 1.21). The reference electrode was placed at the FCz position, the ground electrode at the FPz position. An electrolytic gel was applied to each electrode to keep electrode impedances below 10 kΩ. The EEG was recorded with a sampling rate of 2500 Hz and a bandwidth of 0.016 to 1000 Hz. For the analysis of the raw data, the BrainVision Analyzer (version 2.2.1) was used. The sampling rate was reduced to 250 Hz. The data were referenced to the linked mastoid electrodes (i.e., TP9, TP10). Eye movement artifacts were corrected with the implemented ICA eye correction tool; only components corresponding to horizontal and vertical eye movements were selected, based on their shape, timing, and topography. Other artifact episodes were excluded after visual inspection. Six participants were excluded from the analysis due to a large number of artifacts (< 70% artifact-free segments). For the remaining participants, the percentage of artifact-free trials did not differ between the groups (OLP: M = 93.53%, SD = 7.99; CG: M = 93.50%, SD = 7.75) or the image types (anger: M = 93.51%, SD = 7.40; neutral: M = 93.53%, SD = 6.53) (all p > 0.05).

Data were segmented into 6200 ms intervals (200 ms pre-stimulus onset, 6000 ms post-stimulus onset) and corrected to the 200 ms pre-stimulus baseline. Off-line high-pass (0.01 Hz) and low-pass filters (30 Hz cutoff frequency, 24 dB/octave attenuation) were applied. Data were averaged for all groups and conditions separately. Based on previous literature and visual inspection of the grand-averaged waveforms, we extracted ERPs for the time windows 400–1000 ms (early LPP) and 1000–6000 ms (late LPP) after picture onset. Mean amplitudes were aggregated over a centroparietal cluster (C3, C1, Cz, C2, C4, CP3, CP1, CPz, CP2, CP4, P3, P1, Pz, P2, P4) and a frontal cluster (AF3, AFz, AF4, F3, F1, Fz, F2, F4, FC3, FC1, FC2, FC4).
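For readers who think in code, the reported parameters map roughly onto an open-source MNE-Python pipeline as sketched below. The authors used BrainVision Analyzer, not Python, and the file name and event codes here are placeholders:

```python
import mne

# Placeholder file name and event codes; real recordings differ.
raw = mne.io.read_raw_brainvision("subject01.vhdr", preload=True)
raw.resample(250)                          # 2500 Hz -> 250 Hz
raw.set_eeg_reference(["TP9", "TP10"])     # linked mastoids
raw.filter(l_freq=0.01, h_freq=30.0)       # off-line band-pass

events, _ = mne.events_from_annotations(raw)
epochs = mne.Epochs(
    raw, events, event_id={"angry": 1, "neutral": 2},  # placeholder codes
    tmin=-0.2, tmax=6.0, baseline=(-0.2, 0.0), preload=True,
)

# Mean amplitude over the centroparietal cluster in the early LPP window.
centroparietal = ["C3", "C1", "Cz", "C2", "C4", "CP3", "CP1", "CPz",
                  "CP2", "CP4", "P3", "P1", "Pz", "P2", "P4"]
early_lpp = (
    epochs["angry"].average()
    .copy().pick(centroparietal)
    .crop(tmin=0.4, tmax=1.0)
    .data.mean()
)
print(f"early LPP (angry): {early_lpp:.2e} V")
```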

Memory task

After viewing the images, an unannounced memory task was conducted with 16 images (8 angry expressions, 8 neutral expressions; 50% male, 50% female). Eight of the images had been shown in the experiment; the other images were new distractors. The images had been randomly selected before the experiment (and were different from the images assessed (affective ratings) during the experiment). Participants were asked to decide if they had seen the image in the experiment (yes/no).

Data analysis

All statistical analyses were performed with SPSS (version 28). The investigator who analyzed the data collected in the study was unaware of the treatment applied to the groups. Mixed-factor analyses of variance (ANOVAs) were calculated with group (OLP, CG) as a between-subjects factor and face (anger, neutral) as a within-subjects factor for image ratings, memory performance, and EEG data. Cohen’s d is reported as a measure of effect size. Significant interaction effects (group × face) were followed by t-tests adjusted for multiple comparisons (Bonferroni-Holm).
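Outside SPSS, the same mixed design can be expressed with Python’s pingouin package; the data frame below is illustrative toy data, not the study’s:

```python
import pandas as pd
import pingouin as pg

# Illustrative long-format table: one row per participant x face condition.
df = pd.DataFrame({
    "subject": [1, 1, 2, 2, 3, 3, 4, 4],
    "group":   ["OLP", "OLP", "OLP", "OLP", "CG", "CG", "CG", "CG"],
    "face":    ["anger", "neutral"] * 4,
    "lpp":     [4.1, 2.9, 3.8, 2.7, 5.0, 3.2, 4.6, 3.0],  # mean amplitude (uV)
})

# Mixed ANOVA: group (between-subjects) x face (within-subjects).
aov = pg.mixed_anova(data=df, dv="lpp", within="face",
                     between="group", subject="subject")
print(aov)

# Holm-corrected pairwise follow-ups for a significant interaction.
posthoc = pg.pairwise_tests(data=df, dv="lpp", within="face",
                            between="group", subject="subject",
                            padjust="holm")
print(posthoc)
```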

T-tests compared the groups regarding their ratings of the PowerPoint presentations (rated interest), emotional state before the experiment, and perceived effectiveness of the nasal spray. All tests were two-sided and used a significance level of p < 0.05.

Why Data Analytics is Important in Digital Marketing

Digital marketing is all about data. As soon as a customer clicks on an ad or searches for a keyword, companies collect data that can be used to improve their marketing efforts. But what good is all this data if it cannot be properly analyzed? This is where data analysis comes in.

Data analysis is the process of looking at data and extracting meaning from it. There are many different types of data that can be analyzed in digital marketing, such as website traffic data, social media data, search engine data, and email marketing data. Each type of data can provide valuable insights and by understanding what the data means, businesses can make better decisions about their marketing strategies.

Here are the main reasons why data analytics is so important in digital marketing:

1. Data analysis enables in-depth understanding of customer behavior

Data analysis helps digital marketers better understand the behavior of their customers. This includes understanding how customers interact with different digital touchpoints, what factors influence their purchasing decisions, what type of messaging they respond to, what type of content interests them, and when they are most active online. This information is then used to develop targeted marketing plans and strategies aimed at improving the customer experience and driving conversions.

CRM data analytics is a perfect example of how data analytics can be used to understand customer behavior in digital marketing. If you use CRM software, your CRM will contain information about customer interactions with the brand through phone calls, emails, live chat sessions or website visits, along with their preferences and purchase history. By analyzing CRM data, companies can identify pain points in the customer experience and take action to address them. This can lead to increased customer satisfaction and loyalty, and ultimately to increased sales and revenue.
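As a small illustration of the idea (the column names and numbers here are hypothetical), a few lines of pandas over an exported CRM table can surface which channels leave customers waiting longest and least satisfied:

```python
import pandas as pd

# Hypothetical CRM export: one row per customer interaction.
interactions = pd.DataFrame({
    "channel": ["email", "phone", "chat", "email", "chat", "phone"],
    "resolution_hours": [46, 3, 1, 52, 2, 4],
    "satisfaction": [2, 4, 5, 1, 5, 4],   # post-interaction survey, 1-5
})

# Average resolution time and satisfaction per channel: slow channels
# with low scores are candidate pain points in the customer experience.
pain_points = (
    interactions.groupby("channel")[["resolution_hours", "satisfaction"]]
    .mean()
    .sort_values("satisfaction")
)
print(pain_points)
```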

2. Data analysis helps identify marketing opportunities

Data analysis can also be used to identify new marketing opportunities. For example, by analyzing website traffic data, companies can identify which channels drive the most traffic and conversions, and then focus their marketing efforts on those channels.

If a business finds that its Facebook page is driving more traffic to its website than its email campaigns, it may decide to invest more in Facebook promotion, for example by leveraging the power of video; there are many online video creation tools that can be used to create great videos.

Likewise, by analyzing social media data, businesses can identify any negative sentiment around their brand and take steps to improve their social media presence.

Data analysis can also be used to identify any new marketing trends that could be exploited for competitive advantage. For example, if a business sees that its competitors are investing heavily in a certain type of marketing, like video marketing, it may decide to do the same in order to stay ahead of the competition.

3. Data analysis enables more effective budgeting

Data analytics can also be used to allocate digital marketing budgets more efficiently.

By analyzing data from past campaigns, businesses can see which channels have been most effective in driving results and where there is potential for improvement. This information can then be used to allocate future marketing budgets in a way that maximizes return on investment.

Likewise, by understanding which customers are most valuable, businesses can focus their marketing efforts on acquiring and retaining those customers.

Digital marketing is expensive and businesses need to make sure they are spending their marketing budgets as efficiently as possible. Data analysis is the key to achieving this.

4. Data analysis helps develop targeted marketing strategies

Data analytics can be used to segment customers based on their behavior, preferences, and needs. This helps develop targeted marketing strategies that are more likely to resonate with each individual customer. By analyzing customer data, businesses can see which demographics are most interested in their products or services and tailor their campaigns accordingly.

For example, if a company finds that its products are most popular with women between the ages of 25 and 34, it may decide to target its marketing efforts to this demographic. Likewise, by analyzing customer data, companies can identify any trends in customer behavior. For example, if a company finds that its customers are increasingly interested in sustainable products, it may decide to focus its marketing efforts on promoting its sustainable products.
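A toy sketch of that kind of demographic slicing, with made-up columns and numbers, shows how a purchase log can reveal where a product resonates:

```python
import pandas as pd

# Hypothetical order export joined with customer profiles.
orders = pd.DataFrame({
    "customer_age": [27, 31, 45, 22, 29, 52, 33, 26],
    "gender": ["F", "F", "M", "F", "F", "M", "F", "M"],
    "order_value": [80, 120, 40, 95, 110, 35, 130, 50],
})

# Bucket ages into bands, then measure each segment's share of revenue.
orders["age_band"] = pd.cut(orders["customer_age"],
                            bins=[0, 24, 34, 44, 120],
                            labels=["<25", "25-34", "35-44", "45+"])
revenue = orders.groupby(["gender", "age_band"], observed=True)["order_value"].sum()
print(revenue / revenue.sum())  # e.g. women 25-34 dominating revenue
```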

5. Data analytics helps personalize the customer experience

Personalizing the customer experience and improving customer satisfaction scores become increasingly important as customers are inundated with ever more marketing messages.

Data analysis is the key to providing a personalized experience that will stand out from the crowd. By understanding each customer’s individual needs and preferences, businesses can deliver a more personalized experience that is more likely to lead to conversions.

For example, if a business sees that a particular customer is interested in Product A, it can decide to provide that customer with more information about Product A. By analyzing customer data, businesses can see which products or services are the most popular with specific customers and tailor their offerings accordingly.

6. Data analysis helps evaluate marketing performance

Digital marketing is constantly changing and it can be difficult to keep up with the latest trends. Data analytics can be used to assess the performance of different marketing channels and strategies. This information can then be used to make changes to the marketing mix to improve results.

Data analytics can also be used to track and measure the performance of marketing campaigns. By analyzing the data, businesses can see which campaigns are most effective in terms of reach, engagement, and conversions. This information can then be used to optimize future marketing efforts.

7. Data analysis helps identify business trends

Data analysis can also be used to identify business trends. By analyzing the data, businesses can see which products or services are growing in popularity and adjust their offerings accordingly.

Likewise, by understanding customer behavior, businesses can identify any trends in customer needs or preferences. This information can then be used to develop new products or services that meet these needs.

Conclusion

Data analytics is a powerful tool that can be used to improve the effectiveness of digital marketing. Certainly, this will become increasingly important as businesses strive to stay ahead of the competition in the digital age. It is important for you to have skilled data analysts on your team to make the most of this opportunity. A professional data analyst can help you collect, analyze, and interpret customer data so you can make better decisions about your digital marketing efforts.

3 Tips for Running Better Ad-Hoc Analytics on Business Financial Data

Performing ad hoc analysis is very important for modern business enterprises

As one of the best sources of data for business analysis, a company’s financial data forms the basis for working capital projections, required reporting to tax authorities, and even overall business strategy.

Indeed, financial data contains a wealth of information for companies looking to increase their footprint. A company’s financial condition provides FP&A teams with the ability to empower CFOs with the insights they need to build resilient businesses.

However, analyzing financial data is difficult, especially on an agile basis. As companies collect more data than ever before, gaps in data governance and data quality are hampering analysts’ ability to dig deeper into datasets. You can’t trust the results of your what-if scenario projections if you only have access to the last quarter’s metrics.

As the pace of business accelerates, senior executives need on-demand information to make informed decisions about market priorities and opportunities as they arise. The need for fast insights puts a spotlight on ad hoc reporting, where analysts must query and retrieve data on the fly.

When coupled with deficiencies in data analysis processes and lack of training in data science principles, most companies run the risk of drawing the wrong conclusions. Here are three tips for doing better ad hoc analysis of your financial data.

Review revenue data sources

Companies these days collect revenue from dozens of sources, making ad hoc analysis difficult. Revenue data comes in many formats and structures, and it’s difficult to ensure that this data conforms to your own storage patterns.

However, the problems with these datasets go far beyond storage issues. Often, data from revenue sources arrives in forms that do not lend themselves well to ad hoc analysis.

For example, revenue collected from iOS or Google’s Play Store arrives as a lump sum of data, broken down only by transaction with no further context. Examining the context of the data set forces you to dig deeper into application interaction metrics and customer data repositories if you want to correlate them effectively with revenue trends. In contrast, POS revenue data tends to be detailed and contain high granularity.

Standardizing the granularity of revenue data is essential if you want to run analyses across all data sources. If you don’t, you’ll create data silos that provide incomplete views of your revenue. Worse still, you will have to manually transform and transfer data from one source to another before running analyses.

These processes introduce analysis errors that lead to misleading conclusions, and their tedium only adds friction to ad hoc analyses. For example, you can’t run ad hoc reports on datasets stored in spreadsheets, because the rigid nature of a spreadsheet doesn’t allow you to change parameters on the fly or easily dig into datasets stored elsewhere with new queries.

You also can’t integrate real-time revenue data into your reports, since each silo will be updated at different times. Standardizing the granularity of revenue data may result in a loss of information from some sources. However, it will allow you to automate data collection and cleaning, giving you flexible ad-hoc reporting functionality.
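One way to standardize the grain, sketched here with made-up column names and figures, is to roll the detailed POS transactions up to the same daily totals as the app-store lump sums before joining them:

```python
import pandas as pd

# Made-up inputs: detailed POS transactions vs. daily app-store lump sums.
pos = pd.DataFrame({
    "timestamp": pd.to_datetime(["2022-04-01 09:12", "2022-04-01 14:55",
                                 "2022-04-02 11:03"]),
    "amount": [19.99, 5.49, 12.00],
})
app_store = pd.DataFrame({
    "date": pd.to_datetime(["2022-04-01", "2022-04-02"]),
    "amount": [143.20, 98.50],
})

# Aggregate POS transactions to daily totals so both sources share one grain.
pos_daily = (pos.set_index("timestamp")["amount"]
                .resample("D").sum()
                .rename("pos_revenue"))
combined = app_store.set_index("date").rename(columns={"amount": "app_revenue"})
combined = combined.join(pos_daily)
print(combined)  # one row per day, both revenue streams side by side
```

Aggregating to the coarser grain does discard intra-day detail, which is exactly the trade-off described above: some information is lost, but every downstream query can now treat all revenue sources uniformly.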

Use outcome measures

Analysts often struggle to convey the impact of their models on the business. A big reason for this is the use of irrelevant metrics. For example, using a non-quantifiable metric such as customer satisfaction to convey the impact of an innovative financial model is pointless in an ad hoc report because senior executives cannot directly link customer satisfaction to sales and revenue.

Outcome measures such as ROI and IRR cut through the fog and speak directly to the performance of a company’s investments and projects.

For example, comparing YOY revenue growth is the norm at most companies. However, if your business is a high-growth startup, it makes more sense to compare quarterly revenue instead of yearly numbers in your ad-hoc reports. Some start-ups will benefit from comparing monthly revenue because the growth they experience is exponential.

While ad hoc reporting gives you the freedom to explore data on the fly, it’s important to define the metrics you’ll use in advance to ensure you’re always measuring the right results. In short, explore your data, but be careful when adopting metrics or benchmarks as you go.

When using these metrics, make sure you understand what they convey. For example, ROI and IRR both quantify returns, but they measure completely different things. ROI measures the overall return, while IRR measures the equivalent discount rate in an NPV calculation. In ad hoc scenarios, ROI can provide a quicker read than an IRR, which requires time and discount rate inputs to give it context.

In ad hoc scenarios, the hasty estimation of these figures can generate errors in the calculations which exaggerate the results. The discount rate you arrive at (the goal of an IRR calculation) can be wildly off track over time. Macro factors such as central bank interest rates can make your calculations stale. A simpler return on investment calculation will provide more flexibility and a quick overview of the attractiveness of an investment.
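A quick numerical contrast, using illustrative cash flows and the numpy-financial package, shows how the two metrics answer different questions:

```python
import numpy_financial as npf

# Illustrative project: invest 100 now, receive these yearly cash flows.
cash_flows = [-100, 30, 40, 50, 20]

# ROI: overall return on the initial investment, ignoring timing.
roi = (sum(cash_flows[1:]) + cash_flows[0]) / -cash_flows[0]
print(f"ROI: {roi:.1%}")   # 40.0%

# IRR: the discount rate at which the NPV of these flows is zero.
irr = npf.irr(cash_flows)
print(f"IRR: {irr:.1%}")   # roughly 15% per year

# Cross-check: discounting at the IRR brings the NPV to (about) zero.
print(f"NPV at IRR: {npf.npv(irr, cash_flows):.4f}")
```

The same project thus reads as a 40% total return or as roughly a 15% annual rate; which figure belongs in an ad hoc report depends on whether the audience cares about timing.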

Quantify all data points using dollar amounts for maximum impact in your reports and use data visualizations to communicate results when reporting to senior executives. Although the financial world relies on data tables, they do not lend themselves well to communicating business impact to non-financial audiences.

Consider revenue model biases

When running ad hoc reports, pay attention to the impact of bias on the final results. By definition, ad hoc reports look at business impact right now and use real-time data. However, the diversity of a company’s revenue models can significantly skew the results.

For example, analysts at a company that relies on SaaS revenue will typically witness consistent trends for the most part. You’ll see costs spread relatively evenly, while revenue will also be predictable, assuming the company’s products see traction and the acquisition funnel is buzzing.

However, a freemium model works differently. Revenue per user will increase significantly as cohorts of users upgrade to premium features after the free trial period. Costs per user are also higher initially in this model because the business collects less revenue at first, leading to skewed margins per user.

So, when designing ad hoc reports, you need to decide whether it makes sense to measure business impact per user, given the volatility of user counts.

Keep the big picture in mind by considering seasonal trends and pricing model adjustments when designing ad-hoc reports. After all, a company’s revenue model can also create trends that impact consumer behavior. So, always consider the context in which revenue and financial data is collected.

Challenging but rewarding

Financial analysis is difficult, but the results of ad hoc financial reports help a company capture a snapshot of its performance. Remember to review biases in the data and install appropriate governance processes before drawing conclusions from ad hoc reports. Follow the tips in this article and you’ll get in-depth insights that will future-proof your business.

Teen Builds Database to Help Alabama Foster Families

When Pranav Somu first heard about the opportunity to volunteer for the North Alabama Foster Closet, he knew nothing about the organization. He didn’t even know much about foster care itself.

But after doing some quick research, he learned that half of all foster parents give up within a year due to a lack of resources like those the foster closet provides: diapers, cribs, car seats, clothes, shoes, toys and more, all free.

Now, he’s almost as passionate about fostering as he is about programming. Since the summer of 2020, Pranav has spent over 300 hours creating a program that the organization can use to track the tangible items families receive.

Kimberly DuVall, founder and executive director of the nonprofit in Harvest, Alabama, describes the foster closet as “like a free thrift store” for foster, adoptive and kinship families, who often need literally everything for the children in their care. It is not uncommon for a child to enter foster care with only the clothes on their back. The foster closet responds to approximately 100 requests per month.

“I realized how important it is to provide a safe and stable environment for children who cannot live with their parents,” says Pranav. “I wanted to help the cause, especially since the pandemic posed even more challenges.”

The result of his volunteer work is an invaluable gift that will help families in northern Alabama. “It was a really helpful way to use my skills, a way to give back to the community and connect to a real-world issue.”

Pranav, a junior at James Clemens High School in Madison, Alabama, started the project after Emily Harris, a chemistry teacher at his school who volunteers for the foster closet, came to the computer science department for help. She was hoping to find someone to create a database to replace the Google Form that volunteers and families used, and Pranav jumped at the chance to help her.

“The Google Form was enough when the organization was serving a handful of families,” he says. “But now it has thousands of entries and is slow to use. It takes time and manual labor. They needed a better data management and communication system.” But for a nonprofit that relies on grants and volunteers, a subscription to such a system would be prohibitively expensive.

The foster closet is a grassroots effort that “grew and grew and grew” after Kimberly started it five years ago, almost by accident.

She and her husband had fostered and then adopted two children in Colorado before moving to Alabama. “We had a hard time getting in touch with the adoptive and foster parents,” she says. Shortly after starting a Facebook page, she had “strangers coming to my house, bringing bins of clothes and shoes.”

Before she knew it, she was inundated with donations – as she left the church, people were bringing her bags and bins. Once, while she was at the grocery store, someone spotted her truck in the parking lot and left some items in the bed. “We live in a really generous community here,” she says.

After a year and a half, Kimberly secured a warehouse – with no heating, air conditioning or toilets – to store all the donations. Parents could simply walk into the warehouse, grab what they needed, and leave. “But as we grew, companies donated and people gave us money, and we needed a system in place,” she says.

At first, their system consisted of a pink filing box where families filled out a form with the date and the items they were taking. When Covid hit and the foster closet moved to a new space with heating, air conditioning and toilets, a volunteer started the Google Form. It “was so amazing,” Kimberly says, because multiple volunteers could access it. But soon the form became too cumbersome.

And that’s where Pranav stepped in, investing the time to develop a data management app that “can be adjusted to meet the needs of any charity,” he says. The system “reduces redundancy” by keeping track of items requested by families, recording services provided by the nonprofit, and facilitating data collection.

“It’s a more streamlined process that removes clutter and centralizes information,” he explains. “These organizations often fail to obtain grants due to a lack of data. This allows them to increase their reach.”

Kimberly explains that now parents will simply log into their account and enter the information for each child. “It will help us better report to donors and apply for grants,” she says.

Pranav’s father is a chemist and his mother is working on her master’s degree in computer science. He has a younger brother. “I love science, computer science and math, and my extracurricular activities revolve around that,” he says. He enjoys competing on his school’s math and computer science teams and volunteering through Madison City Schools Beast Academy teaching advanced math to young children. He is the captain of the Science Olympiad team.

Even in middle school, Pranav seemed destined to do great things for his community. His team won a national award in the eCybermission web competition, in which students choose a real-world problem and work out a scientific way to solve it.

Unsurprisingly, he hopes to study computer science in college and one day work in a software-focused field.

The as-yet-unnamed app he developed for the North Alabama Foster Closet could be used by other similar organizations, Pranav says, who could adapt it to their own needs.

Earlier this year, Pranav impressed the board with an update on the program. He will always have access “on the back-end” and can work on the system if necessary. “I’m done now,” he said. “We’re just testing and making sure there are no bugs.” So far, a few families have tried it and recommended adjustments.

“Apparently I signed him for a lifetime commitment,” Kimberly laughs. “He’s an amazing youngster. To think he started this at 15! He’s such a gentle spirit with a kind and generous heart. Very inspiring.”

___

To learn more about North Alabama Foster Closet, visit northalabamafostercloset.com. Check out the organization’s Amazon Wishlist at https://a.co/9vk0wEA.

DataStax adds real-time data streaming to managed AstraDB service

DataStax is adding low-latency change data capture (CDC) capabilities to AstraDB, its NoSQL database-as-a-service.

Based on the open source Apache Pulsar project, the new capability is designed to help customers integrate low-latency data streaming into their applications. Originally built at Yahoo, Pulsar has evolved into an open-source event streaming tool that rivals Apache Kafka; it processes and delivers database changes in real time and distributes the results to the landing destination of your choice.

“You can stream big data changes coming into your database to other destinations such as Snowflake, BigQuery, C++, Java, or any other programmatic client,” said Chris Latimer, vice president of product management at DataStax.
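On the consuming side, such a change stream is read like any other Pulsar subscription. The sketch below uses the Python client with placeholder connection details rather than Astra’s actual endpoints or topic naming:

```python
import pulsar

# Placeholder connection details; Astra Streaming uses its own URLs/tokens.
client = pulsar.Client("pulsar://localhost:6650")
consumer = client.subscribe(
    "persistent://tenant/namespace/astra-cdc-events",  # placeholder topic
    subscription_name="cdc-to-warehouse",
)

try:
    while True:
        msg = consumer.receive()  # blocks until a change event arrives
        # Each message carries one database change; forward it to the
        # destination of your choice (warehouse, cache, another service).
        print(msg.data().decode("utf-8"))
        consumer.acknowledge(msg)
finally:
    client.close()
```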

The big promise here is that customers can more easily combine operational and streaming data in a managed environment.

“What seems unique is that DataStax offers CDC functionality for a non-relational database, whereas most CDC options are for relational databases. CDC is usually a feature either of the [database management system] or of a system designed for near-real-time data change detection,” said Carl Olofson, vice president of research at IDC.

There is currently a healthy debate in the industry over whether Apache Pulsar, a server-to-server messaging service, is better than its more established rival Apache Kafka, owing to its cloud-native design and its suitability for more distributed applications.

DataStax sees compelling use cases for this new CDC feature for high-data customers in the crypto, travel booking, and software-as-a-service industries, according to Latimer.

“DataStax’s cloud-native CDC companion service is aimed at enterprises that have global deployments and need to rapidly scale their database,” said Doug Henschen, principal analyst at Constellation Research.

Last year, in September, DataStax made AstraDB available on all major cloud providers – Microsoft Azure, Amazon Web Services and Google Cloud Platform – following rival MongoDB’s July announcement of a serverless NoSQL Atlas DBaaS.

Copyright © 2022 IDG Communications, Inc.
