Earnings Call Transcript
Operator: Thank you for standing by, and welcome to Netskope, Inc. Fourth Quarter and Full Year Fiscal 2026 Earnings Conference Call. At this time, all participants are in a listen-only mode. After the speakers' remarks, there will be a question-and-answer session. I would now like to hand the call over to Michelle Spolver, Chief Communications and Investor Relations Officer. You may begin.
Michelle Spolver: Good afternoon, and thank you for joining us today. With me on the call are Netskope, Inc. CEO and Co-Founder Sanjay Beri and CFO Andrew Del Matto. The press release announcing our financial results for the fourth quarter and full year fiscal 2026 was issued earlier today and is posted to our Investor Relations website at investors.netskope.com along with a supplemental presentation. Before we begin, let me remind everyone that some of the statements we make on today's call are forward-looking, including statements related to our guidance for the first quarter and full 2027 fiscal year, growth opportunities, competitive position, and the impact of AI adoption. These forward-looking statements are subject to known and unknown risks and uncertainties, which could cause actual results to differ materially from those anticipated by these statements. Additionally, these statements apply only as of today, and we undertake no obligation to update them in the future. For a detailed description of risks and uncertainties, please refer to our SEC filings as well as our earnings press release. Finally, unless otherwise noted, all financial metrics we discuss on this call other than revenue will be on an adjusted non-GAAP basis. We have provided reconciliations of these non-GAAP financial measures against the most directly comparable GAAP financial measures in our earnings press release. Now let me turn the call over to Sanjay to discuss our business and high-level Q4 financial performance.
Sanjay Beri: Thanks, Michelle. Welcome, everybody, and thank you for joining us to discuss Netskope, Inc.'s fourth quarter and fiscal year 2026 results. We ended the year on a high note with results that exceeded our guidance across all key metrics. Our focus on delivering a market-leading platform for networking, security, and analytics for the modern world of cloud and AI is resonating well with customers and driving both new and expansion business. Strong global execution resulted in robust fourth quarter results, highlighted by record net new ARR of $57 million and ending ARR of $811 million, representing true organic growth of 31% year over year. Revenue in Q4 grew 32% year over year to $196 million, and revenue for the full fiscal year 2026 also grew 32% to $709 million. We continue to leverage the investments we have made in our Netskope One platform and NewEdge Global Private Cloud network to drive efficient growth, which is reflected in the five percentage point improvement in operating margin in Q4 and an 18 percentage point improvement for the full fiscal year 2026. We are also very pleased to have generated $12 million in free cash flow for fiscal year 2026, marking a notable milestone as Netskope, Inc.'s first ever year of positive free cash flow. We also saw a strong mix of both new logo growth and customer expansion across key verticals and geographies. The average number of products per customer increased to 4.4. Customers are solving key use cases through the adoption of our Netskope One platform of 25 security, networking, analytics, and AI products. This is reflected in our net retention rate of 116% and 22% year-over-year growth in customers with over $100,000 in ARR. Andrew will give more color on our Q4 and full year fiscal 2026 financials in a few minutes. We have had many innovation, go-to-market, and operational highlights during the quarter, and one theme that threaded prominently across all of these was AI. Netskope, Inc. 
is uniquely positioned as a significant beneficiary of the AI super cycle because we have engineered a unified AI-native fabric that eliminates the legacy trade-off between performance and security. We have organically built the Netskope One platform as the intelligent edge—an inherently adaptive architecture where our native AI fluency and active context are seamlessly integrated into our global infrastructure defined by its performance, resilience, and dynamic orchestration. I would like to spend some time today walking through the four pillars of our AI strategic framework, which demonstrate why Netskope, Inc. is the essential engine powering the scale of the modern AI enterprise. First, Netskope, Inc. is an AI-native platform with sovereignty and privacy by design. From day one, we were engineered as an AI-native platform. Our competitive moat is architectural, not a single bolt-on feature. Since our inception, we have leveraged and integrated AI as foundational across our platform. In the early days, we used shallow and deep learning, and today, we further augment these capabilities with generative AI models to deliver maximum impact for our customers. Every implementation follows our core principle of privacy by design. We believe intelligence should never come at the cost of data privacy or sovereignty, and we operate a library of more than 190 proprietary purpose-built, specialized AI models optimized for security and network performance. This quarter, we continued to enhance those models, and our AI labs released additional models that are used in our new AI security products. Unlike competitors using just generic LLMs, Netskope, Inc.'s intelligence is purpose-built, high speed, and hyper accurate. The second key pillar in our AI strategy is enabling and securing AI in real time. While legacy and first-generation SASE vendors perform so-called post-event autopsies with out-of-band scanners, Netskope, Inc. 
provides real-time, in-line AI security for corporate and shadow AI. We do not just see that an AI transaction is happening. We understand the data and intent within it and take real-time granular action. Let me explain. Most solutions simply see a connection. Netskope, Inc. understands the deep interaction. Because our proxy is natively AI fluent, we possess the active context to determine dynamically in real time the specific AI app instances, activities, data, and more. We also determine the semantic intent of prompts and responses in real time and enforce in-line policy. Our Netskope One Agentic Broker—one of a number of new AI products we announced earlier today—also seamlessly applies this to all MCP transactions, either sanctioned or unsanctioned. This is why at Netskope, Inc., we are not just observing and enabling the AI revolution. We are the engine generating the high-fidelity data that secures it. In today's world, the most valuable asset is not just the AI model. It is the unique real-time data and transaction telemetry that understands the intent and lineage behind every interaction. Netskope One generates a vast and proprietary set of AI-fluent metadata and data for trillions and trillions of transactions a month that traditional security tools simply cannot see or generate, decoding the complex language of AI agents, generative AI apps, AI tools, and cloud JSON in mid flight. But while the power of our unique data is a competitive moat, our purpose is singular—the dynamic protection of our customers and their nonhumans and humans. We leverage this unprecedented visibility to protect our customers, ensuring that they can innovate at the speed of AI without ever compromising the integrity or sovereignty of their most sensitive information. We apply in-line decisiveness—or put another way, we make granular go or no-go decisions in flight. 
This allows us to stop sensitive data from entering prompts and block poisoned AI responses or injection attacks before they ever reach a customer's environment, protecting users, apps, and autonomous agents alike. And our Netskope One AI Guardrails, also announced today, take this to the next level. These deep contextual controls are necessary for enterprises to move from prohibiting AI to enabling AI with confidence. We do not just see more. We protect better, turning our unique data into the ultimate foundation of trust for the agentic era. Our third AI strategy pillar centers around providing differentiated and unmatched performance and resilience through our NewEdge AI infrastructure. AI transactions are uniquely sensitive to latency. Legacy networks create a latency tax that breaks AI performance. Conversely, our NewEdge infrastructure—the world's largest, high-performance private security cloud—is the AI Fastpath. It is the most resilient, high-speed highway for AI transactions globally, including AI applications and agents hosted in public, private, and new clouds. It is also an agile edge. Our infrastructure is defined by performance, resilience, and dynamic orchestration. We process complex security at the edge, closest to the agent, app, or user, reducing lag for secure real-time inference while allowing the network to dynamically adapt to the high-velocity traffic patterns of the AI era. And lastly, our fourth AI strategy pillar is a platform built organically and natively for the agentic economy, with an AI-fluent proxy and for autonomous operations. The perimeter has shifted from being filled with people to entities. Netskope, Inc. uniquely addresses this with a platform that specifically speaks the language of AI. As we mentioned, our architecture and native AI-speaking proxy innately understand APIs and JSON. This AI-native visibility enables hyper-granular zero trust control over AI transactions that other vendors simply cannot see. 
Through our just-announced Netskope AI Gateway, we extend this enforcement anywhere—public cloud, private data center, or the edge. We also just released our first autonomous Netskope AI agents, which have already been met with exceptional customer feedback. One of the areas our agents will address is automating operations. These classes of agents from Netskope, Inc. automate complex network and security tasks, drastically reducing the human-in-the-loop requirements for global enterprises. Our recently released ZTNA AI agent has been received very well in this area. And finally, we offer universal governance. By marrying our market-leading data protection with our newly introduced Netskope AI Guardrails, we secure and moderate for acceptable use all communications, whether via the Model Context Protocol (MCP) through our Agentic Broker, prompts, or shadow AI, for humans and nonhumans alike. Because of these four unique and highly differentiated pillars of our AI strategic framework, many of the world's most sophisticated enterprises choose to secure, accelerate, and analyze their AI transactions through Netskope, Inc. As a result, Netskope, Inc. has become one of the most definitive sources of enterprise AI data usage and trends in the world. We were pleased to announce today the Netskope AI Index, which consists of a first-of-its-kind interactive view of real-time AI usage across the world—data covering virtually every country, industry vertical, and company size—providing a granular view of AI adoption and attributable intelligence that positions Netskope, Inc. as the authority that customers and the public can cite when describing the real-world trajectory of the AI economy. We have also kept our foot on the gas innovating across our Netskope One platform of security, networking, and analytics products, and in other related areas. Let me share a few recent examples. 
On the data security front, we strengthened our competitive edge with the introduction of Netskope One Data Lineage. Data Lineage enables security teams to track and visualize the movement of sensitive data across their entire organization through various levels of origin, usage, and access, including visibility into when that data propagates or evolves. We also introduced new capabilities and integrations to improve secure connections to enterprise applications from unmanaged or BYOD devices. Enterprise browser support was expanded to a new range of iOS and Android mobile devices, while deeper integration with our Remote Browser Isolation and Private Access solutions provides a range of highly secure deployment models to enable users to connect to private apps through web browsers on their unmanaged devices. And on the networking and infrastructure side, we delivered DNS as a Service, which enables customers to point their DNS traffic to Netskope, Inc. for resolution; we can then apply our DNS content filtering and security to provide secure access for often overlooked and unprotected use cases like guest Wi-Fi access. Our continued innovation expands our robust Netskope One unified platform of 25 security, networking, analytics, and AI solutions, providing more opportunity to land and expand with customers. We are an organically built, truly integrated, modern platform for the AI and cloud era. I want to emphasize this point about platforms in the context of what we repeatedly hear from customers. They are telling us that while they desire truly unified platforms over a slew of point solutions, they also are not seeking a single platform for all their security and networking needs. The unification they desire in a platform is what Netskope, Inc. uniquely delivers. We built our AI-native Netskope One organically, not through M&A, an approach that often results in a disjointed, cobbled-together solution and frustration for customers. 
Our 25 products share one common code base, one engine, one console, and one network, providing both better efficiency and a seamless customer experience. And our NewEdge private cloud runs Netskope, Inc.'s full stack of products at high speed at all of our more than 120 locations. To illustrate how our customers are adopting our Netskope One platform, let me pivot to our go-to-market accomplishments during Q4. We saw significant customer wins across verticals and geographies, with customers turning to the Netskope One platform to enable AI adoption, modernize their security and infrastructure, consolidate vendors, and replace legacy and first-generation cloud security products. I will touch on some key wins and expansions across common use cases. First, customers are choosing the Netskope One platform for modernization to facilitate secure access from human and nonhuman identities to their AI ecosystem, including generative AI apps and LLMs, cloud apps, web apps, and private apps. One notable win was a large global manufacturer that sought better data visibility and control and the ability to safely allow the use of AI and cloud. They chose six Netskope, Inc. products to secure generative AI and cloud access, protect their data, and improve their network performance. We also landed a top regional health system in the US that initiated a modernization selection process to replace its legacy security and network infrastructure after previously suffering a severe and costly breach. We shined in this competitive bake-off, and the customer purchased 11 products within our Netskope One platform, including our next-generation SWG with AI controls, ZTNA Next, Advanced DLP (which includes proprietary AI models from our AI labs team), Borderless WAN, Cloud Firewall, Enterprise Browser, and other products to replace legacy firewall and networking products. Customers also continue to turn to Netskope, Inc. for our superior unified data protection. 
For example, one of the largest hotel companies in the world with operations in nearly 150 countries was a notable unified data expansion win during the quarter. This client needed continuous real-time visibility into the full breadth of their data security posture. They also needed to confidently manage and protect sensitive data across cloud, on-premises, and hybrid environments, including AI and cloud data stores. They purchased Netskope, Inc.'s DSPM solution and are now using 14 products in the Netskope One platform across their organization. Many customers also deploy our Netskope One suite to modernize their infrastructure, including replacing legacy VPNs with our zero trust architecture, replacing branch firewalls with our Borderless SD-WAN, and migrating their networks to our high-performance private cloud, especially as they adopt more AI and cloud where performance becomes even more critical. For example, we landed a Canadian gaming company that needed to modernize their network for AI and cloud, provide secure remote access to their global workforce, and meet governmental regulatory requirements. This customer replaced legacy hardware products at more than 5,000 locations with Netskope, Inc.'s Borderless SD-WAN and ZTNA Next secure access solutions and purchased our Next-Gen Secure Web Gateway, with added AI protections, for secure generative AI usage. In another win, a European government chose us to modernize its legacy on-prem security infrastructure and drive data sovereignty. They wanted to consolidate multiple point solutions into a single centrally managed platform to protect sensitive data while meeting strict government regulatory requirements. The organization purchased our comprehensive Netskope SSE platform covering security for AI, web, cloud, and data. 
Finally, our global fast and resilient NewEdge private cloud network makes us particularly well suited to deliver globally distributed and highly regulated customers the data sovereignty and regulatory compliance that they require. For example, we landed two of the largest banks in Africa in two separate deals. One was a unified data protection deal that we won in a bake-off against a primary competitor. The other was a network modernization deal where we replaced the same competitor with Netskope, Inc.'s market-leading SSE solutions and higher-performing NewEdge private cloud. Our local data centers in Africa were a driver for the wins, as we are able to deliver superior performance and data sovereignty to these customers. The geographic and vertical diversity of these customer wins and use cases is a testament to our clear technical differentiation and disciplined execution across all regions. Our new customer wins were all competitive bake-offs against primary competitors, where the capabilities of the Netskope, Inc. platform proved themselves in extensive POVs. We continue to land with multiple products and have seen strong growth in multiproduct adoption across our customer base. As of the end of Q4, 56% of our customers were using four or more Netskope One products, and 27% were using six or more products. I am pleased with the strong performance of our go-to-market team across the globe. Our newly hired sales reps are ramping, and our tenured reps are delivering strong productivity. We put great leaders in place and recently filled some of our remaining key sales leadership positions, including the appointment of Joe Welch to lead our US public sector vertical, an area where we are underpenetrated but have strong opportunities. Joe is a seasoned veteran with decades of public sector experience. We are also continuing to hire highly talented reps in all geographies, many of whom are joining us from key competitors across our space. 
I just returned from our annual sales kickoff, and I can tell you that our team's excitement, energy, and conviction is truly palpable. Our momentum is only building. As part of our comprehensive go-to-market strategy, we also continue to strengthen our relationships with system integrators and strategic partners. During Q4, we partnered with the largest GSI in the world on a major enterprise deal in the energy sector, supporting digital transformation and zero trust for approximately 80,000 employees. Other recent engagements include a large government defense customer in Asia Pacific and a major health care customer in North America. This key partner holds over 150 certifications on the Netskope, Inc. platform, and their support extends our global operations. This is just one example of how we are partnering well on large-scale, partner-driven enterprise transformations globally. On the technology partner side, Netskope, Inc. also recently achieved the Amazon Web Services Security Competency status for AI Security. This competency assures AWS customers that Netskope, Inc. has met technical and quality standards to deliver best-in-class solutions for securing AI workloads across AI security use cases. In closing, I want to reiterate that we are in the early innings of an AI super cycle that is exposing a fundamental flaw in legacy and first-generation SASE architecture. Legacy security acts as a latency tax on AI performance, forcing enterprises to choose between safety and speed. We believe the next decade will be defined by a structural shift towards an intelligent edge architecture built specifically for an autonomous agentic economy. Netskope, Inc. is uniquely positioned for this era for three reasons. First, we have an architecture for the future. While legacy vendors proxy the past, Netskope, Inc. is the distinctly AI-native proxy with innate fluency to secure the languages of the future—APIs, JSON, and the emerging protocols like MCP. 
Second, we scale without friction. We have eliminated the security tax. Our AI Fastpath infrastructure has unique capabilities to perform complex, real-time security at the speed of AI inference. And third, we have an intelligence moat. Our advantage is rooted in active context and the real-time AI-fluent proprietary data we generate. While others count traffic, we understand intent. Our proprietary data from trillions of real-time transactions, validated by the Netskope AI Index, makes us the indispensable source of truth for the AI economy. Sitting squarely at the intersection of cloud, AI, networking, and security, Netskope, Inc. has a massive market opportunity, which is projected to grow to at least $149 billion by 2028. We have just begun to scratch the surface and look forward to what is to come. The plumbing of the AI era is being laid today, and it will take many years to be fully realized. By unifying high-speed performance with deep semantic intelligence, Netskope, Inc. is not just selling a platform. We are providing the essential adaptive fabric for the modern AI enterprise. Fiscal 2026 was an incredible year of growth and expansion for Netskope, Inc., and our IPO in September was just the beginning of our public company journey. We see an incredible path ahead as we attack the AI security and networking opportunity with exciting new products, continue to bring more customers onto our platform, expand business with existing ones, drive further innovation, ramp our sales team, and drive awareness globally. I am proud of what we have accomplished—particularly our first full year of positive free cash flow generation and industry-leading ARR growth at scale. I look forward to seeing many of you at RSA in a few weeks, where we will demonstrate and share more about our AI strategy and new products, and engage in other ways in the months ahead. 
With that, let me now turn it over to Andrew to provide financial details on the fourth quarter and our outlook for the first quarter and fiscal year 2027. Andrew?
Andrew Del Matto: Thank you, Sanjay, and hello, everyone. As Sanjay shared, Netskope, Inc. had a very successful fourth quarter, closing out the year on a strong note. We continue to deliver significant growth as our investments in NewEdge, new product innovation, and our go-to-market organization continue to pay off. Before I share Q4 and fiscal year 2026 results, let me remind you that all financial comparisons are on both a year-over-year and non-GAAP basis unless stated otherwise. For the full year 2026, we are proud of what we accomplished. We delivered revenue of $709 million, or 32% growth; ARR of $811 million, up 31% year over year; net new ARR of $193 million, up 35% versus fiscal 2025; operating margin improvement of 18 percentage points while continuing to invest in our innovation and go-to-market engines; and we generated $12 million in positive free cash flow, which marks Netskope, Inc.'s first fiscal year of positive free cash flow and an improvement of $163 million over fiscal 2025. This translates to a 30 percentage point free cash flow margin improvement year over year. Moving on to Q4 results. ARR grew 31% to $811 million at the end of Q4. As Sanjay noted, we also had a record quarter for net new ARR of $57 million. Q4 revenue grew 32% to $196 million. We also experienced strength across geographies. In Q4, revenue in the Americas grew 32%, EMEA increased 36%, and APJ grew 26%. Our teams executed well, and our investments in our sales organization are paying off. In terms of customer metrics, the number of customers generating more than $100,000 in ARR in Q4 grew 22% year over year to 1,531. Enterprise and large enterprise customers are our focus, and more than 85% of our ARR comes from $100,000-plus ARR customers. Note that the average ARR from this customer cohort grew to more than $450,000 per customer. This is indicative of our success in both expanding our installed base and securing significant new enterprise deployments. 
Our Q4 net retention rate, or NRR, was 116%, while our churn and downsell rates remained at historic lows. Composition of deals varies quarter by quarter, but our consistently strong NRR reflects our customers' ongoing confidence in Netskope, Inc.'s platform and expansion of their deployments as they consolidate vendors and modernize their infrastructure. Customers view Netskope, Inc. as a long-term strategic partner given our commitment to innovation and ability to deliver products that solve the complex and evolving security challenges in the cloud and AI era. In addition to NRR, we look at multiproduct adoption to demonstrate our expansion opportunity within our customer base. As Sanjay mentioned, at the end of Q4, 56% of our customers were using four or more products versus 48% a year ago, and 27% were using six or more products, up from 22% a year ago. We are pleased with this progress and believe our 25-product Netskope One platform gives us a clear opportunity to continually expand within our growing customer base as they consolidate more of their security and networking stack with us. Moving on to the rest of the income statement. We saw the benefits of Netskope, Inc. being built to scale. Gross margin was 76%, an increase of approximately five percentage points from Q4 last year. Our gross margin expansion is being driven by the efficiency of our NewEdge architecture, which is generating better unit economics as we scale. Q4 operating expenses totaled $171 million, up approximately 3% sequentially. Operating margin improved five percentage points year over year to negative 10%. R&D expenses improved 100 basis points year over year to 36% of revenue, driven by earlier investments in a common data platform and hiring in high-talent, cost-efficient locations. Sales and marketing expenses remained flat at 40% of revenue as we continue to invest in quota-carrying sales reps. 
Our consistent improvement in gross margin and operating margin reflects the operating leverage we have unlocked as our earlier strategic investments in infrastructure and talent begin to compound. Net loss per share was $0.04 using 395 million weighted average shares outstanding. As a reminder, our non-GAAP EPS excludes the change in fair value of the convertible notes we issued prior to our IPO. Fully diluted share count using the treasury stock method was approximately 503 million shares as of January 31, 2026. We generated $4 million in free cash flow in Q4, representing a 2% free cash flow margin. Note that we achieved this through our laser focus on efficiencies, even in the first year of our transition to annual billings. We are pleased with our ability to drive positive free cash flow, as this demonstrates the leverage inherent in our model. While we will continue to realize the benefits of being built to scale on margins and cash flow, our path to sustainable positive free cash flow is not expected to be linear. The timing of cash collections can vary quarter to quarter, and we expect to continue investing in the business for long-term growth. And finally, we ended the fourth quarter with $1.2 billion in cash, cash equivalents, and marketable securities. Before I share our guidance for the first quarter and fiscal year 2027, let me briefly outline some factors that should be considered. We are continuing to make investments in our business, most notably in R&D and sales and marketing. We are continuing to hire sales reps across the globe to support our expanding market opportunity aligned to the AI super cycle that Sanjay noted. At the same time, we are leaning further into our AI roadmap and expanding our AI-native Netskope One platform with additional products to support our customers' AI adoption journeys both today and in the future. 
While we are adding AI engineers and data scientists to drive further innovation in this important emerging area, we are also empowering our teams with AI tools to drive efficiencies in development and other areas of our business. We expect to see most of the impact from these investments on operating margin during the first half of the year, leading to improving operating margin in the second half of the year. As we look at gross margin, we are on track to achieve our long-term target of 80%. With the foundational investments we have made in NewEdge, we now expect margin gains to come through top-line growth and continued optimization. Now that gross margin has improved into the mid-seventies, we expect progress from here to be more gradual; it may not follow the step-function gains seen in recent quarters. Also, as we have discussed in the past, we are shifting customers to annual billing on multiyear contracts where possible. Billing annually will improve predictability and consistency of our cash flows. I am pleased to highlight that this transition is occurring faster than we originally expected. While it is difficult to predict exactly how this will impact future free cash flow, we expect to see the most significant impact in Q1 with negative free cash flow in the range of $50 million to $60 million. We expect that to improve in the second quarter, return to positive free cash flow during the second half of the year, and to end the full year with a positive free cash flow margin in the range of 2% to 4%. We will continue to provide you with quarterly updates as we progress throughout the year. We began this billing transition a year ago and expect to see the bulk of the impact this year. And finally, we believe we are uniquely positioned as a significant beneficiary of the AI super cycle due to our unified AI-native fabric that eliminates the trade-off between performance and security. 
At the same time, we are early in the year, still have a large portion of our sales reps ramping, and we are continuing to establish our reporting cadence as a public company. And while AI and cloud adoption are driving significant interest in platforms like Netskope, Inc., we recognize that macro and geopolitical factors have the potential to impact customer spending plans. We have built our guidance with these factors in mind. Let me now provide our guidance for Q1 and fiscal year 2027. As a reminder, these numbers are all non-GAAP unless stated otherwise. For Q1 fiscal 2027, we expect revenue in the range of $197 million to $199 million, representing growth of approximately 26% at the midpoint; operating margin of approximately negative 16%; net loss per share of $0.06 to $0.07 using approximately 405 million weighted average common shares outstanding. We expect to see the largest free cash flow impact of our transition to annual billings in 2027 with much of that impact in Q1. As I mentioned, we expect negative free cash flow in Q1 of $50 million to $60 million. For the full year fiscal 2027, we expect revenue in the range of $870 million to $876 million, representing growth of approximately 23% at the midpoint; gross margin of approximately 77%; operating margin of approximately negative 10%, gradually improving from negative 16% in the first half of the year; net loss per share of $0.19 using approximately 415 million weighted average common shares outstanding; free cash flow margin in the range of 2% to 4%. Note that the annual billings transition is estimated to reduce our free cash flow margin by approximately six percentage points, which is reflected in this guidance. As noted earlier, we expect that to improve in the second quarter, return to positive free cash flow during the second half of the year, and end the year with positive free cash flow. We have highlighted these modeling points in the appendix of our investor presentation. 
In closing, we remain confident in our ability to execute on our long-term strategy and innovation, driving strong and durable revenue growth and capturing share of our expanding opportunity. We remain focused on prioritizing disciplined execution and strategic investments that strengthen our competitive advantage and continue to drive growth and margin expansion. Innovation drives our flywheel for growth. As such, we will continue to invest in data and AI engineers while utilizing AI to drive efficiency and product velocity. We will also continue to invest in go-to-market while remaining fiercely committed to delivering profitable growth. Thank you for your time today. With that, I will turn it over to the operator for Q&A.
Operator: Thank you. As a reminder, to ask a question, please press 1-1 on your telephone; to withdraw your question, press 1-1 again. We will now open for questions. Our first question comes from the line of Brian Essex with JPMorgan. Your line is open.
Brian Essex: Great. Good afternoon. Thank you for taking the question and congrats on some solid results. Maybe one question for Sanjay and then a follow-up for Andrew. I guess, Sanjay, where would you assess that we are in the maturation cycle with respect to enterprises knowing what they need to secure AI? Are your AI security announcements ahead of the curve, or are these approaches that you are already seeing CIOs demand as they look to, kind of, you know, secure their, you know, AI estate? And then, you know, for Andrew, could you maybe just help us understand the context of the sequential revenue guide? Looks like Q1 would imply only up a couple of million dollars. So we would love to understand the puts and takes there. Thank you both.
Sanjay Beri: Yes. Great question, Brian. So I think, first of all, from an AI perspective, most organizations are in the infancy. They are in the first inning. 90% of their usage of AI is shadow AI, meaning they actually did not bring it in, their end users did. And so when you think about that concept, you harken back to this really just being very early. And so from an AI security perspective, our focus is always to skate to where the puck is going, anticipate what they will need, and deliver a best-of-breed solution to solve this problem—discover their AI, guardrail it, control it, and then enable it with precision. And so that is what these new products do, building upon our previous capabilities to enable AI. So we will share more at RSA and beyond as well on that. Andrew?
Andrew Del Matto: Yeah. Thanks, Brian. In terms of the Q1—I think you are talking about Q1 guidance—again, it is our first year as a public company, and so we are going to remain prudent, as we have said in the past, and so there is that. We do have reps ramping, and we have talked before about how they tend to ramp more later in the year, let us say, and we still have quite a bit of ramping going on in terms of the reps that have come in over the last year. And then finally, there are some geopolitical, macro headwinds that probably happened over, you know, I would say the last couple of weeks.
Brian Essex: Right. Right. Thank you both. Very helpful on both fronts.
Operator: Thank you. Our next question comes from the line of Meta Marshall with Morgan Stanley.
Meta Marshall: Great. Thanks, and echo congratulations. Maybe for Sanjay, just in terms of, you know, are you seeing—I think during the IPO process, you kind of talked about these four main use cases that people were, kind of, coming in with. Are you seeing any changes in what either those use cases are, or as you start to expand more of the product portfolio that you are selling, just any changes to, kind of, where a majority of people are coming in? And then maybe a follow-up. Just in terms of, you know, maybe the net new ARR growth this quarter—net expansion stepping back from Q3—just any commentary on, kind of, what you saw there would be helpful.
Sanjay Beri: Thanks. Great question. So from a use case perspective, when you look at our top use cases, the first was coming in to help people enable cloud and web no matter where they are. The second was securing and enabling AI. I will say that has moved up in the stack. Every conversation I have, people come to us and say, look. We already run all our AI traffic through you. We released the Netskope AI Index today—it is probably the first definitive source of worldwide AI tracking by vertical, by geo, and by size of customer. That kind of shows you the amount of AI traffic traversing the NewEdge network. And so what people have come to us to say is, look. You are the Fastpath to AI. Help us secure it, enable it, guardrail it, and let us say yes to it. And so that is a top, top use case that they are coming to us with, and that has been elevated. Obviously, the other ones—remote access, modernize my infrastructure, converge, consolidate, simplify my network security stack—all of those are still top of mind, but definitely the AI one has been raised. And so we are very excited about that, to be blunt, because we feel like, hey. This is what we were born for. Right? Our proxy is really a JSON, API, MCP-fluent proxy. Start with cloud and now AI. It is sort of a one-two punch in a good way for us. So we are very excited, obviously, about what is to come, to be blunt, in the many, many years ahead because we are early, obviously, in the AI super cycle. As far as net new ARR, we had, obviously, a high comp in Q4 of last year. You can see that as you kind of track that growth metric. We are, obviously, happy to record the highest net new ARR we have ever had. You saw the growth in our customers of over $100,000 in ARR, right, at 23-plus percent, and, obviously, strong upsell as well. And so for us, the other big point to remember is we really started hiring and ramping our reps mid year, so beginning in Q3 last year. 
It takes about 12 months for them to ramp to full productivity, and so that is another big piece for us that we continue to drive.
Meta Marshall: Great. Thanks.
Operator: Thank you. Our next question comes from the line of Robbie Owens with Piper Sandler.
Robbie Owens: Great. Appreciate you taking my question this afternoon. Wanted to ask more high level just around the revenue model. And as you think forward—and I know it has been disclosed here—you are primarily a seat-based model. And obviously, there are some concerns in the market about what seat-based models look like going forward, especially in light of all the recent layoffs. So as you add new modules and new capabilities, do you see that shifting more either towards traffic or capacity, or will it remain an underlying seat-based model that you are protected by, adding more modules on top? So would just love some color. Thanks.
Sanjay Beri: Yeah. It is a great question. So when you look at what we do, we run the traffic for most enterprises. We run everything. All their generative AI traffic, all their agentic traffic, their cloud traffic, their on-prem traffic—it goes through us. The reality, though, is there is no free lunch on our network. And so if you are going to run users through our infrastructure, which is what obviously people do, you pay for that by user. If you are going to run agentic traffic, right, whether it is server-side or client-side, whether you have an AI agent, you pay by transaction. And so all of the new products we released today, they are charged by transaction. What is a transaction? It is a prompt and a response. Right? That is kind of the token for the agentic economy, and that is how we charge. So no matter what people run and what that balance is over time, we are going to make money off that. And so you will see, and you have already seen, four new products today—all transaction-based—which essentially maps to what you can think about as tokens.
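The seat-plus-transaction structure described in this answer can be sketched as follows. The rate constants here are hypothetical placeholders, not actual Netskope pricing; only the billing structure (seats for human users, per prompt-and-response transactions for agents) comes from the answer:

```python
# Illustrative sketch of a seat-plus-transaction pricing model.
# Rates below are made-up placeholders purely for illustration.

SEAT_RATE_PER_YEAR = 60.0   # hypothetical $/user/year
TRANSACTION_RATE = 0.001    # hypothetical $/transaction

def annual_cost(users: int, transactions: int) -> float:
    """Total annual cost: seats for human users, per-transaction for agents.

    A 'transaction' here is one prompt and its response, as defined on
    the call ("the token for the agentic economy").
    """
    return users * SEAT_RATE_PER_YEAR + transactions * TRANSACTION_RATE

# Example: 10,000 seated users plus 50M agentic transactions in a year.
print(annual_cost(10_000, 50_000_000))  # 650000.0
```

The point of the structure is that revenue scales with whichever side grows: if human seats shrink but agentic traffic explodes, the transaction term picks up the slack.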
Operator: Thank you. Our next question comes from the line of Gray Powell with BTIG. Your line is open.
Gray Powell: Okay. Great. Thanks for taking the question. Yeah, so maybe one on the product side. So one of your larger network security peers, they appear pretty bullish on the potential for improved demand in the SD-WAN market and the opportunity for legacy replacement this year. Netskope, Inc. also often receives high marks on WAN capabilities. So I am just interested—what are you seeing in your pipeline, and then how often are you having discussions where both security and networking buying centers are involved in deals?
Andrew Del Matto: Yep. Great question. So first of all, you are right. Like, in an organization, when you map out the structure, you have a CIO and you have the security leader and the infrastructure ops leader. You also now have an AI leader, and we often train all our reps—go after that square. You have to hit all four. Now where buying decisions are made, it can be in one or it could be multiple. But we obviously hunt across all of those. For us, we are a networking and a security company for the cloud and AI era. Right? And so we think about consolidation of both. The SD-WAN—what does it do? It is for speed. It is for performance. It is for resilience. And that is how we view it. And so we offer it in software form factor on your endpoint. Right? You can put it in your infrastructure. And we have seen great growth in it. You saw a great win we had in a very large distributed organization where they combined our SD-WAN as a smart on-ramp to our NewEdge network and all our security functionality. And so that concept, often called unified SASE by analysts, for us, we can deliver on that. And so what do I see in the future? Well, look at agentic traffic. The key is that agentic traffic is going to come from a user working remote. It is going to come from an oil rig, which is running AI on it. It is going to come from agents, right—many of them running on servers. All of that traffic needs to be accelerated. And that is really what the AI Fastpath is. We are the best path for agentic and non-agentic traffic, whether you are doing inference or beyond. So SD-WAN is just one small part of that fast story.
Gray Powell: Okay. Thank you very much. That was helpful. Thank you.
Operator: Next question comes from the line of Matthew Hedberg with RBC. Your line is open.
Matthew Hedberg: Great. Thanks for taking my question, guys. Andrew, I am curious—in your prepared remarks, I think you said deal composition can change from quarter to quarter. Is that sort of the reason why NRR ticked down by a couple hundred basis points? I am just trying to get a little more clarity on that element.
Andrew Del Matto: Well, first of all, we view a 116% NRR as very strong, to be frank. I think anything in the mid to upper teens is something we would be very happy with, Matt. But look. NRR does vary quarter to quarter. Some quarters, we have more upsell. Some quarters, we have more new logo revenue. You know what? I think we have said that before. And note that Q4 a year ago was one of the strongest, if not, you know, a record quarter from an upsell perspective—Q4 of FY '25. That being said, looking forward, we have a large installed base. Average customer has, I think, 4.4 products. We have 25-plus products, four new products announced today—again, as Sanjay said, transaction-based—so a lot of white space and, you know, hopefully, a lot of upside there. And just want to mention that downsell and churn remain at historic lows. So retention remains very strong.
Matthew Hedberg: Got it. Thanks, Andrew.
Operator: Thank you. Our next question comes from the line of Brad Zelnick with Deutsche Bank. Your line is open.
Brad Zelnick: Great. Thank you so much for taking the questions. A lot of good information that you have revealed, you know, in these results. I have got one for Sanjay, one for Andrew. Sanjay, you spoke to a lot of this in your remarks, but I just want to hit it head on. It is great to see the unveiling of Netskope One AI Security today, and I think there is consensus that network traffic will grow exponentially as AI agents are rolled out into production. My question is, with the massive throughput requirements that agentic east-west traffic may demand, why is SASE and more specifically Netskope, Inc. best positioned to secure this traffic versus maybe a virtual firewall vendor? And then just quickly for Andrew—Andrew, just why is the shift to annual billings happening faster, and should we expect to see that result in maybe an unexpected benefit to ARR and revenue as you get better pricing?
Sanjay Beri: Thank you very much. Thanks for the question. So when you look at agentic traffic, what is, like, an AI agent doing, and what are people most worried about it doing? Well, an AI agent unleashed will go access your endpoint. Guess what? Netskope, Inc.—we monitor that. We have our endpoint data protection. It will go access your cloud apps, right, over the Internet. That is what we do. We monitor it, understand what it is accessing, restrict what it can access dynamically, whether it is a shadow agent or not. Your on-prem data—that is what our AI-enabled ZTNA does. And so I guess the summary is, when you look at what Netskope, Inc. does, we have a sensor that sees all traffic that goes back on-prem, to your cloud, to your AI applications—no matter where it goes—to a website. We also have a sensor on the endpoint where we do our data protection and beyond. And then we also have a sensor which looks at all out-of-band activity, right, when you look at our lineage around understanding how these applications work, with CASB and beyond. And so we are in this unique spot where the world is about—do you have unique data? Can you generate proprietary data that no one else can see or has? And that is what we do. Because we are the most performant, largest cloud private network, because we have the ability to interpret this data at a much more granular level, we understand the agentic interactions at that granular detail. And as a result, with our policy enforcement, 1,000s of them have chosen us to secure their agentic traffic.
Brad Zelnick: Very helpful.
Andrew Del Matto: Yeah. And, Brad, great question on the billings. Look. I would remind everybody that, you know, the billings transition provides strong predictability and consistency of both billings and, ultimately, free cash flow. We added slide 24 to help illustrate the transition and where we are. I would point everybody to the 78% growth in future committed billings. And the interesting part about that is we can see what is coming. We can actually see the dates we bill. We can obviously model collections better. And, again, we have been free cash flow positive, and that will, obviously, tilt up later in the year. But as far as why it is going faster? It is just really strong execution. We have been very focused on it internally. We have been inspecting the deals, so to speak, and making sure that we are communicating with the salespeople and helping them through the transition, along with our customers. And then just in terms of pricing, I mean, the way we really think about pricing is we focus on value. You know, we have high win rates, and so pricing to us is more about selling the value of our products. And quite frankly, you know, we will continue to focus on that. We have new products to offer—I think a stronger story with the new AI products coming out—and those are the things I would really look to, to strengthen the trend on pricing.
Brad Zelnick: Awesome. Thank you so much for taking my questions.
Operator: Thank you. Our next question comes from the line of Jonathan Ho with William Blair. Your line is open.
Jonathan Ho: In terms of your profitability guide for fiscal 2027, I know you talked a little bit about investments. Can you help us understand maybe where you see the most opportunity to place those investments? And what would be, sort of, the timeframe for us to see perhaps an inflection in growth as you spend more on R&D and sales and marketing? Thank you.
Sanjay Beri: Yep. Great question. So from an investment perspective, obviously, you have seen our yearly guide, but you also saw that we are investing upfront. And that upfront investment is really in AI—continuing to AI-enable our R&D team. When you look at the world today that we live in, I believe that every engineer can be a 10-times engineer. And AI is not about vibe coding something or so on. It is about making your elite engineers 10 times more productive and 10 times more focused on architecture. And so what do you have to do to enable that? Well, you want to invest in AI orchestration. That could be to help them automate workflows, to automate their validation and testing, to automate, sort of, the rote stuff that they have to do, so they can focus on the unique part. And so that is what we are doing in Q1—investing in that AI tooling in the first half. Now what you will see after that, in the second half and beyond, is that we really do not need to ramp our R&D headcount as much as you may have thought, right? We can be a lot more efficient. And so this is about continually laying the groundwork for R&D efficiency. You have got to invest a little in AI tooling, and then you see a lot of that benefit in R&D leverage. And you will see that, obviously, as we continue to drive our R&D percentage of revenue downwards. And so that is probably one of our bigger investments. The second is in sales and marketing. We mentioned that really mid last year, we started bringing on more reps, and those reps take about 12 months to ramp. Well, not only do we continue to invest in enabling them, but we are hiring more teams, right? And we know what is in front of us in terms of the TAM for the next decade—one of the most durable TAMs you will ever find in any industry, including security, where we operate on the far right of security, right? We operate the network. 
We operate the infrastructure, the highway, to everything that you can think of. And so we want to take advantage of that and continue to ramp and hire from a sales perspective. But we are doing that very responsibly, with this notion of being very efficient in R&D by investing in tooling. And so that is really our upfront investment in the first half. And that is why you see what you saw from the guide on Q1, Q2 versus the rest.
Operator: Thank you. Our next question comes from the line of Richard Poland with Wells Fargo. Your line is open.
Richard Poland: Hey. Thanks for taking my question. Just a quick one for me. I think it was Andrew—you mentioned the geopolitical, macro headwinds kind of happening over the last couple of weeks. I just wanted to clarify on that. Is that something that, you know, you are starting to see show up in demand and pipeline? Or is it more that you are observing what is going on in the macro environment and taking some extra cautionary steps in the guide?
Andrew Del Matto: Fair question, Rich. I think, you know, it is something—I think we can all recognize that there have been more events in the last couple of weeks. So it is something just to consider in terms of being prudent, in our mind. Keep in mind, in terms of that area of the world, so to speak, we have a very small percentage of our business there. So it is less about that and just more about, you know, what I would call kind of a broader macroeconomic risk.
Richard Poland: Okay. Great. Thanks, guys. Just prudence.
Operator: Thank you. Our next question comes from the line of Shrenik Kothari with Baird. Your line is open.
Shrenik Kothari: Yeah. Thanks for taking my question. So the AI Fastpath, as you said, really shifts the focus from just securing AI to securing at scale with an AI-native fabric and the new modules that you announced. So as it pulls the conversation away from, like, a traditional kind of FTE-based cost toward a broader discussion, can you talk a little bit about how the AI Fastpath has been progressing in your pipeline right now? And then I have a quick follow-up. Thanks.
Sanjay Beri: Sure. Yeah. It is a great question. Like, we have always believed that ultimately nobody implements security unless it has a great end user experience. In the agentic world, performance matters more. Agents talk constantly, right? They can talk at a rate that is 100 times a human. And so it accentuates the need for a fabric that can operate and perform and be resilient worldwide. If you look at our infrastructure, it is the largest private cloud in the world. It is the largest highway or airspace for AI. And so those 120-plus data centers, with our architecture and software and memory operating at high speed on all agentic traffic—that is a huge advantage for us. And what we tell customers is just try it. Just measure it. You will see a very, very noticeable performance difference. Whether you are an AI agent or you are an application, you are a user, right? You are an IoT device. And so the AI Fastpath is the next evolution of that for the AI era. You are going to a coding application, right? You are going to any of the thousands of generative AI apps you can see on the AI Index. We are going to be the fastest path to get there. We are going to handle that. We are not going to throw it on the public Internet. We are going to get you there directly. And so for us, the AI Fastpath is a big part of how we think about the agentic era—its performance, resilience, in addition to security—and we are combining them all.
Operator: Thank you. Our next question comes from the line of Eric Heath with KeyBanc. Your line is open.
Eric Heath: Hey, thanks for squeezing me in, and solid finish to the year, Sanjay and Andrew—over 30%. Maybe just one for you, Sanjay, and maybe a quick one for Andrew. Sanjay, just following up on some of your comments about the customer wins in the quarter being, I think, all of them competitive bake-offs, and I think we all really appreciate that it has been a static set of competitors for a long time, but there are some incremental competitors out there that have popped up in the last couple of years. So curious if you could just talk to whether the competitive set—who you are bumping into in these deals—is changing at all. And then, Andrew, if I could, just any high-level guardrails you want to give us on ARR for the year would be great. Thanks.
Sanjay Beri: Great. So from a competitive perspective, we have, you know, 25 products. We just released four. One of the great things about the efficiency in R&D that you have seen—obviously, R&D percentage of revenue going down, obviously the supercharging of it with AI and the need not to hire as many from an R&D perspective—is that you are also seeing velocity increase. I think I previously said that we release on average two products a year or so. Well, we have already released four-plus, and I think that trend will continue for us. And so we are very excited, obviously, about that supercharging. And as a result, because of the breadth of what we do, we do see different competitors. For example, in the data protection area—which, you know, data, frankly, is what drives the agentic world—we still see a lot of the legacy folks, right? You cannot imagine how much legacy Broadcom Blue Coat, Symantec, Trellix, and all the rest is still out there. Whereas perhaps in the traditional kind of web world, proxying web, you would see the competitors that you may see in a Magic Quadrant, as you would expect. And then when we delve into what I talked about with the AI Fastpath—the performance—you really do not see anything there because the network is obviously just very different, very unique from that perspective. And so I think, like, we hit across a cross section of competitors. But what is noteworthy is our win rate of over 80% if we get to a POC, a proof of concept, right? That has held. And so our nirvana is just get to a POC—whether it is about enabling, securing AI, securing cloud, converging, consolidating your network infrastructure. That is why we are growing our sales teams. That is why we announced the GSI partnership and that win with the largest GSI. And that is why we continue, you know, to power through in the mid market with our MSPs. That is why we went public, to be blunt: drive awareness. 
And so that awareness takes time. It is coming. And we definitely have the platform that, when you get to knock that door open, we will win.
Andrew Del Matto: And, Eric, on ARR, again, you know, while we do not guide to it, maybe I can be helpful with how to think about modeling. You know, last quarter, we provided the history. I would do the same thing. You can see that, I think, ARR growth was about a point below revenue growth. So, you know, if I were modeling, I would model, kind of, a point above, a point below—something like that, right in that range.
Eric Heath: Awesome. Thank you, gentlemen.
Operator: Thank you. Our next question comes from the line of Shaul Eyal with TD Cowen. Your line is open.
Shaul Eyal: Thank you. Hi, good afternoon. Andrew, maybe can you talk to us about ASP patterns in light of rising memory prices?
Sanjay Beri: I will take that question. So for us, when you look, first of all, at our landing and our average ARR per deal size—you can calculate it—it continues, you know, to go up for our customers. When you look at memory, people often talk about that in the case where you sell boxes and appliances and everything ships with memory. That is obviously not really what we do in the majority of cases. For us, it is our infrastructure. It is our network. What runs on it is our software. And so we feel good about our guide for this year in terms of incorporating what you just said. Obviously, that is a fluid environment, so we will watch that for next year. But we definitely feel good about the guidance we have given from a financial metric perspective that incorporates all of what you described. The reality is that for us, when you think about us, we process all this traffic. You can see it. You should go to ai-index.netskope.com. And when you look at that traffic, what matters there is what you do when you see it. And, ultimately, that, in many cases, is our moat. It is uniquely taking those transactions and generating very unique, granular data that can then inform your security policies, your security analytics, optimization of that, your guardrails, and so on. And so for us, obviously, we are excited about continuing to drive more into our existing infrastructure, which can more than handle what we need to drive for this year.
Shaul Eyal: Thank you, Sanjay.
Operator: Thank you. Our next question comes from the line of Trevor Walsh with Citizens. Your line is open.
Trevor Walsh: Great. Hey, team. Thanks for taking the question. Maybe just a quick one for you, Sanjay. Just wanted to square some of the comments that you made both in the prepared remarks and your responses to some of the questions. You said that the AI revolution is exposing legacy architectures within SASE. Is that going to result in, like, just breaking of those legacy or more just dissatisfaction, just generally, with performance? And then secondarily to that, is there some sort of leading indicator that investors could use to just determine whether or not more of that breaking or dissatisfaction is going to come once AI and agent traffic is getting to a certain point? Maybe the AI Index you just released gives us clues there. Just trying to get a sense of when we really start seeing the wheels fall off, potentially, of other players, if that makes sense.
Sanjay Beri: It is a good question. So I would look at it from two sides. One is the infrastructure and network side, and one is security—because you kind of need both. On the infrastructure and network side, the agentic era will expose networks that were built, for example, in the public cloud, where you are going to get way worse performance. When you have more interactions back and forth, the performance difference becomes bigger, right? It became big with cloud. It will become bigger with AI. So, one, your infrastructure. The second is, for us, we run all services everywhere—120 data centers. Everything we do runs everywhere. We do not hairpin people to a public cloud for one service, to your own infrastructure for another. And so just the purity and the modernness of our architecture and our infrastructure leads to just better performance. And AI accentuates that. So, one, I do think the infrastructure and the network of others gets exposed. The second is, remember, since the beginning of Netskope, Inc., we have always said we are not trying to build a web proxy, right? We were building a modern API/JSON proxy. And it sounds technical, but what does it mean? The language of AI is that. The language of AI is APIs and JSON. I have a patent sitting outside my door here, which is real-time interpretation of Internet traffic at the API level. And the reality is that the AI era is about that. How do I say to someone, hey, you cannot use a personal instance of Gemini, but you can use a corporate version? And if you want to send sensitive data there, you can only do that with a corporate version. How do I have all these policies that guardrail and enable people to use AI, yet satisfy the business policies they want? You need something that truly understands the new language of the Internet, which is really what AI accentuates. And so for us, one of the engines to our car is a high-speed, distributed, in-memory, API/JSON proxy. 
That is unique. And so I remember this customer who came to me and said, Sanjay, I bought a SASE. I bought it, and it was working. But then I started adopting AI and cloud, and I have to bypass 70% of all traffic because all it can do is block the app or allow it. I do not want to block AI. I do not want to allow it either. I want something more granular. Right? And that customer moved all their traffic to Netskope, Inc., right? It is close to 100,000 users and agents and beyond. And they are very happy. And so I think that will happen more and more over time. But as you know, it is an enterprise, and an enterprise does not do things instantly. And so that will be a transition that will happen over the next many years.
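The kind of granular, instance-aware policy described in this answer (allow a corporate Gemini instance, block a personal one, rather than blanket-blocking the whole app) can be illustrated with a small sketch. The field names and rules here are hypothetical, for illustration only, and do not reflect Netskope's actual policy engine or schema:

```python
# Toy sketch of API-level policy evaluation: decisions keyed on app,
# instance type, and data sensitivity parsed from JSON traffic, rather
# than a blanket allow/block of the application. Illustrative only.

def evaluate(event: dict) -> str:
    """Return 'allow' or 'block' for one parsed API/JSON event."""
    app = event.get("app")
    instance = event.get("instance")          # e.g. "corporate" or "personal"
    sensitive = event.get("sensitive_data", False)

    if app == "gemini":
        if instance == "personal":
            return "block"                    # personal instances disallowed
        if sensitive and instance != "corporate":
            return "block"                    # sensitive data: corporate only
        return "allow"
    return "allow"                            # other apps: default allow here

print(evaluate({"app": "gemini", "instance": "personal"}))   # block
print(evaluate({"app": "gemini", "instance": "corporate",
                "sensitive_data": True}))                    # allow
```

The point is that the same application yields different decisions depending on instance and data context, which is only possible when the proxy parses the API/JSON payload rather than just matching the destination.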
Trevor Walsh: Great. Thanks all. Appreciate it.
Operator: Ladies and gentlemen, due to the interest of time, our last question will come from the line of Michael Romanelli with Mizuho. Your line is open.
Michael Romanelli: Yeah. Hey, guys. Thanks for squeezing me in here. So, Sanjay, you touched on this in, you know, prior response, but how does your sales capacity today compare to where you were a year ago, both in total as well as in the number of ramp reps? And then separately, I guess, how would you characterize or assess your pipeline, you know, as we head into fiscal 2027? Thanks.
Sanjay Beri: Yeah. It is a great question. So for us, we obviously started ramping, hiring more reps really full force last year. And you can see that in, sort of, the S&M spend, as it ramped early midyear last year. And it takes about 12 months for us to ramp those reps. And so if you look at that, it is really in the second half of the year when a lot of those reps will be fully ramped. And, by the way, as for fully ramped—as you know, when you get a rep, we do not throw them into a place and give them a bunch of existing accounts. They are hunting new greenfield accounts. So they get on, they start hunting those accounts. They build their pipeline. They get the POC, do the MSA. That is why you have those ramp times, to be clear. And then we are continuing to hire. And so we are building that rep funnel for next year as well. And so, really, that is the best way for you to think about it—a bunch of those fully ramped reps coming online in the second half of the year.
Michael Romanelli: Beautiful. Okay. Great.
Operator: Thank you. I would now like to turn the call back over to Michelle for closing remarks.
Michelle Spolver: Thank you, Towanda, and thank you all for joining us today and also staying a few minutes over. Look forward to engaging with you in the weeks and months ahead, including at RSA this month, where we will be sharing more about our AI strategy as well as demonstrating our newly announced AI products. Thank you all. Have a good evening.
Operator: Ladies and gentlemen, that concludes today's conference call. Thank you for your participation. You may now disconnect.