Colors of InfoSec Podcast

Season 1, Episode 6 - Security, Privacy & A Whole Lot of *** with special guest Tazin Khan Norelius (@techwithtaz)

In episode 6, we're chatting with our first guest on the show, cybersecurity specialist, empath, and critical thinker Taz (@techwithtaz). With over 10 years in the security space, Taz works to bring mindfulness into cybersecurity while making noise around the lack of consumer data ownership and privacy. We talk about:

  • The importance of consumer awareness and education relative to security, data privacy & tech ethics.
  • Why tech companies don't really want everyone's opinion & why capitalism wins. 
  • Empowering consumers through education in an accessible, approachable way, and at scale, through her research organization Cyber Collective and her weekly video series Security, Privacy, and a Lot of Other Bullsh*t.
  • Amplifying underrepresented voices.
  • Why you should always ask questions vs unconsciously clicking and sharing. 
  • And why she recommends a consistent digital/social media detox every weekend.

(This episode contains some expletive language)

Resources

  1. Learn more about Taz @ techwithtaz.com // Follow her on Twitter @techwithtaz
  2. Learn more about her org Cyber Collective @ https://www.cybercollective.org/ & on Instagram @cybercollectiveorg
  3. Find out about upcoming CyCo events at https://www.cybercollective.org/events


Follow us on Twitter

Music

  • Track: Too much ice
  • Artist: Young Kartz via freemusicarchive.org
Speaker 1:

Welcome to the Colors of InfoSec podcast, a podcast demystifying what it means to navigate a career in information security and technology as people of color. I'm your host, Asafa Mott. And I am Christina Merino, and we're here to give you an all-access pass into tech and InfoSec's past, present, and future. We're celebrating Cyber Awareness Month and we wanted to kick things off right. So today we are welcoming Taz. Nice to meet you, Taz. How are you doing today?

Speaker 2:

I'm good. Thank you guys for having me. I'm so excited. Well, first and foremost, thank you for having me. I love this. Um, I have been in the security industry for the past 10 years now. I got started, I feel like, in a really unorthodox way. I was broke and I was working at Michael Kors, and a woman walked in and, um, she looked fire, and I just asked her what she did and how I could do it. And she told me she was in cybersecurity. So I bothered her for a few months, and then she finally hired me. And that was really where my start came into, um, cybersecurity. And that goes into, like, my journey a little bit. I started in sales and business development, and in order to sell, I had to learn product, and I sat on a floor full of engineers. And so the technicality of security started to intrigue me a lot. Um, fast forward, that brought me into kind of technical consulting and doing security architecture and product mapping. Um, but through this entire journey and process, I felt really lonely, um, in the industry. And that's really what pushed me to start, I wasn't sure how to set it up at first, but I founded Cyber Collective, um, my organization. Um, it's a cybersecurity and privacy research organization that's rooted in hosting events to educate consumers and the regular public, um, about cybersecurity and privacy and why they should care. And then we take their direct feedback, um, and leverage it to come up with articles, briefings, and even policy, um, or advocate for policy changes. So that's where I am today. It was a loaded question, so I wasn't sure, you know, where to start. But, you know, as I've mentioned, I've seen you on Instagram, and I know that you, um, recently launched season two of your Security, Privacy, and a Lot of Other Bullshit. Is it like a video show, like a talk show? Like, what are we calling that these days? Like a webcast, a web series? I don't know.
Um, I don't know what you'd call it. I just called it a series. And, high key, I called it season two because I had a break and it wasn't consistent. I was like, alright,

Speaker 3:

I'm going to drop that one down. I'm gonna drop out.

Speaker 2:

Hey, I'm sure you guys already know, making content is exhausting. Um, but the show, so, you know, like I was saying about the loneliness factor, like, yeah, I started Cyber Collective, but then also, like, all of my friends and my peers... I just started making friends with cool people like Christina, you know, in the security industry. Like, I've been lonely as shit out here, because it's just been not people that are like me, um, that I have found until very recently. So I was also just trying to create, I guess, a place where I fit and I feel comfortable, and I felt comfortable making content, if that makes sense, because I could bring my personality and talk about things the way I want to talk about them. Because I felt like I had to just code-switch, and I can't explain things the way I want to explain them in certain business meetings and whatnot, you know? So that's the show. Um, because I was like, well, if they're over here talking in a way that makes me not even want to continue a conversation, then clearly people aren't going to want to listen to that shit. You know what I mean? Um, and all of my friends and my peers are, like, creatives. My brother's a rapper, and so many of my friends are in the entertainment industry, even. And so I was like, there's a way to bridge this gap and I want to do it, because I think I can. And so that's really what set it off for me. Um, but it's a show where I just talk about security, privacy, and a lot of other bullshit.

Speaker 3:

From, like, a consumer perspective, right? Like, is it more geared towards consumers, or small and medium businesses, or enterprises?

Speaker 2:

Definitely, definitely geared to consumers. Definitely geared to an audience that is a bit more lighthearted and can handle the bit of harshness and bluntness that I bring. Um, this season, I guess, I specifically talk a lot about election security and, um, misinformation. But the purpose of my show is less around, like, how-tos, telling you, like, this is how you can become more secure, and focused more around philosophizing about security and data privacy, and understanding how there are a lot of dots that aren't necessarily connected. Right? Um, the entire premise of my show is, I want to bring people information so that you can start questioning things the way I question things. Right? Um, so it's definitely more existential, if that makes sense, than it is very much, like, a show about cybersecurity and privacy policies. I touch on topics that maybe are considered a bit more mainstream, right, and try to get people to understand, like, hey, this is why this matters, and this is how you can break it down so that you understand why it affects you. Cause I don't think people understand why a lot of things that are happening in industry affect them as individuals. Um, so definitely, definitely consumer-focused, uh, 150%. I have made videos in the past for small businesses and whatnot. But I think that if you educate individuals, you will in turn educate all small businesses, because individuals are... yeah, exactly.

Speaker 1:

Yeah, I think that's an interesting take, because I saw the episode you did on drones, and I thought it was dope. Because, like, that's something I could let my mom see. I can let some of my relatives see it. They don't understand kind of what we do in InfoSec, right? Or they don't understand privacy, the importance of it, and things of that nature. So your series, I think it just speaks to everybody. Like, it could speak to millennials, it can speak to older people. So I thought that the audience would really like this. The way you're putting it out is just something that we need in this field. And, um, I thought it was sick. I was really feeling it, to be honest with you. And I thought you did a good job, and the production, the way it's set up, I just like it. Um, also, you mentioned Cyber Collective a little bit, right? And that you guys are, like, gathering research and things. Could we talk a little bit more about that, in terms of, like, what was the mindset behind it?

Speaker 2:

Yeah, absolutely. So, as with any business, I feel like, you go through iterations. And the one thing that I started with was knowing this shit is going to change. So just make sure you set up your business, or whatever you're doing, um, knowing that you might pivot or you might change and shift. And, um, you know, Cyber Collective originally started as a cybersecurity services organization for small businesses, medium-sized businesses, and nonprofits. That's where, like, when I started this a year and a half ago, I was like, yeah, I'm going to do this, and I'm going to speak to people in a language that makes sense, XYZ. A year and a half later, I think that, you know, the turn this took was based off of a lot of just advocacy work and a lot of, like, pro bono work I started doing, um, in the last year and a half, from an architecture perspective. And I realized how much, even for us in the security industry, I mean, from an individual consumer security perspective, and then also just data privacy and ethics in general, not everyone's privy to it. You know, just because you're in cybersecurity doesn't mean you know shit about privacy. Like, yes, there's compliance that's being pushed through, you know what I'm saying, with GDPR, with CCPA. But if you ask security professionals that aren't in charge of compliance and risk in their organization, you know, like, organizations that are large and have an entire risk team, the InfoSec group doesn't always know what's going on from a policy perspective, right? And vice versa from a technical perspective for people that work in policy.
Um, and I just noticed that gap, which was very real for me. But specifically, as for why research is the goal and why we turned it into a research organization: it was because of the feedback I was getting from people with the videos I was making. I've been getting just messages and messages and messages of incredibly impactful feedback from consumers on their ideas, people from all different types of industries. They're like, yo, you broke this information down in a way that I finally understand, and now I have thoughts and opinions. And so I thought in my head, I'm like, wow, these are incredible insights that I think need to be leveraged to help build fucking policy, right? And we're over here as security and privacy professionals developing policy, but are we getting voices from undocumented immigrants? Are we getting feedback on surveillance technology and how we feel about surveillance technology being implemented? Are we asking marginalized communities what their perspective is on how their data is being affected, and what would their feedback be on these ethics? And right now, almost every single ethics organization, I don't want to be hyperbolic, there are many, many ethics organizations that probably have more representation, but it's mostly white. It's a white-dominant field. And so our events are successfully attracting global audiences right now. I was so shocked when I saw that. And we wanted to create a space where, like, it's just the formula: we're going to teach people what's happening, tell you exactly what the policy means so that you understand, and then we want to know how you feel about it, because it's ethics research really rooted in emotional intelligence

Speaker 3:

And how it's affecting people from an emotional perspective. All of the data is done, um, more quantitatively, right? And so this is why we're using grounded theory research methods, um, to go about our research at this time. We will most likely expand on that, but, um, you know, kind of a long-winded answer still. There's just so much, and I'm so passionate about it. It's funny that you mentioned the, like, disconnect between security professionals and privacy in general. You can have security without privacy, but you can't have privacy without security. I think, I mean, I look at it from a product perspective as well, you know, having kind of visibility into how these big tech firms build products and sell software and stuff like that. And the fact of the matter is that they don't want everybody's opinions. Right? Exactly. Cause like, right, if I get opinions from documented or undocumented immigrants, or from any other population, it's going to be: hell no, I don't want you to take my data; hell no, I don't want you to, you know, do X, Y, and Z. Right? And I think that's a problem, right? It's definitely a huge problem, but, you know, at the end of the day, I think capitalism wins. So I really like the fact that you are empowering through education, right? Like, you're helping, you're feeding that data, that information that people are craving, like, desperately, um, in an accessible, approachable way, to get them to pose these questions. Like, you're giving them that food, that energy, like, you're giving them that Red Bull. So then they're like, wait a minute, hold on a second. If I sign up for this account and I accept this privacy policy, what am I really doing? Because Taz said on her video that, you know, I ain't getting paid for this, you know what I'm saying? So, like, I really loved that.
I think we need more of that, and I feel like you're an activist in that regard, right? Like, you're pushing those boundaries and, um, really challenging the status quo for, like, the betterment of humanity. Right. Thanks. I appreciate that.

Speaker 1:

I agree. I think a lot of these underrepresented populations, they do need a voice in America, as you can see just with our leadership now. Um, they're trying to silence that voice. So the fact that you're trying to amplify it, and to try to give them, you know, a say in a matter like privacy, is huge. So I applaud you.

Speaker 3:

Thank you. You know, the thing is, like, and I've learned this: last week we had an event on what is on our ballots, right? And we dove into Proposition 24, which is regarding data privacy, on California's ballot. We went through, um, what's on Michigan's ballot and what's on Massachusetts' ballot. And that event idea came about

Speaker 2:

Because somebody, the week before, sent me a message asking me a question. Like, hey, there was a question on privacy on California's ballot, and I didn't know what to say. And that wouldn't have even been a thought in my brain. Like, that was something that someone else brought to me. Right? And that's the power of notifying the public and talking about these things, and talking about them in a way that's entertaining, that people can actually want to sit through and listen to. And it's also, like, with these voices, something that we're trying to answer right now. We had a research and development call right before I got on this podcast, and we're really trying to dig into, like, what is the best way? Do we start developing? Do we partner with groups to develop, bring the voices to those groups, right? If that makes sense. Like, do we act as a middleman and, say, work alongside policy developers to add whatever feedback the public wants, and take questions, like, in survey form, back and forth? Or do we... we're trying to figure out what's the best method right now to have marginalized voices a part of the tech ethics dialogue, right? How do we get them involved in ethics? Like, I saw something, and we're doing an event about it in a few weeks, about Zuckerberg and Facebook helping develop an internet ethics policy. I saw that and I was like, what hypocritical bullshit is this? There is no way. Right? And it's the people with the money that are the ones making the ethics. But the people have power, right? And the power is, if you mobilize people enough, you can... if you get people to stop using Facebook because they know what's actually happening... I know that's a very radical thought that won't ever happen, but that's the point of Cyber Collective. It's like, we're not getting rid of it. The tech giants aren't going anywhere, social media isn't going anywhere, and we have to live with it.
It's ingrained in every single part of our lives now: our work, our livelihood, the way we make money, even. So it's not, like, an "if you can't beat them, join them" type deal, but we joined you. So now let's make you better, because we know that this is just what it is. And so let's find a solution instead. So that's really the goal of this kind of think-tank-oriented research group, right, that Cyber Collective has. Um, and it's consumer-focused and consumer-direct. And the groups that we attract... I mean, like, I'm not trying to toot my own horn or anything, but I've never been to any event as diverse as Cyber Collective events, ever, in my 10 years in information security and cybersecurity. Like, ever. And I think that's the most powerful part of all of this, because in most of my meetings, I'm the only person of color on those calls.

Speaker 3:

You know what's interesting? Like, folks of color, we can mobilize people. We can find people, whether they're white, of color, or whatever. But then, like, our counterparts at these businesses...

Speaker 2:

Yeah.

Speaker 3:

Then you get an all-white panel, right? And it's like, oh, we couldn't find anyone, we don't have enough, you know, there's no pipeline, or whatever. And it's like, are you serious? Right? And you're like, you didn't even try. That's always interesting to me. Shouldn't consumers also hold these tech companies accountable? Because I feel like we forget, right? You just said something: you said, you know, people will probably never stop using Facebook. Right? Um, and I'm sure you've watched The Social Dilemma, like, a trillion times as well. And you know there's, like, a science behind that, and that whole addiction, right? So I just wonder at times, like, do you see that even educating consumers and sharing this data... like, do you notice impact? Do you see actual change? Do you see people delete their Facebook accounts? Cause I fear that even though people know, like, now you have the education, you've given them the education, you would assume that people know better and do better, but people don't do better even though they know better. Right? Because of that. Not always, right?

Speaker 2:

Not always. I think that it depends on the demographic of people. And that's where this, like, emotional intelligence awareness, and almost the aspect of mental health, has such a huge role in what we do. And that's a part of my series, and, just, like, the whole... you know, Christina, you just said you were on a panel about digital mindfulness today, right? So that element of digital mindfulness has a huge role in all of this. And I have seen significant behavioral differences. Every Friday, I delete all of my social media off of my phone. Like, I don't go on Facebook. I didn't do it last weekend, because I leverage Instagram for work at times. But I've been doing this for the past, like, two and a half, three months now, where every Friday I delete, like, Instagram and Twitter, um, off of my cell phone, and I don't go on them at all. And I've been talking about it, and I say, like, alright, y'all, Fridays I'm out. And then I come back on Monday, and I talk about all this shit that I did over the weekend because I didn't have my phone in my hand. And it's amazing. Like, I love this practice so much. And there are, so far, hundreds of people that have started to do this. And, you know, a lot of folks, I'm like, yo, make it your own. And the thing is, people are like, um, yeah, I wanted to share this. And, this is a whole separate conversation, but some people try to, like, put ownership onto concepts they don't own. Do you know what I mean? So I have been encouraging everybody, like, make it your own. Make it your movement, for your algorithmic bubble, to sign off for a few days. And Christina, let me tell you, people have been doing it.
And I have seen the beautiful, beautiful things that they come back with, everything that they did over the weekend. Like, even things as simple as a bubble bath. And, um, men talking about taking bubble baths too, because that's not something that usually happens, right? But, um, just, like, people doing things with their hands, spending time with their family more. So I think that if we can really just normalize the way we talk about stuff, that's what the difference is, right? It's when we connect at a human level and we say, like, hey, this is our life now. It's just about balance, on so many different levels. And it's the same way with, like, the ethics conversation, almost. It's like, okay, we know that consumers can't drive, like, policy development. We need experts, 100%, we need experts. But we can have, like, middleman experts that help rally a lot of voices, and, you know, articulate all of those thoughts and formulate them in a way that experts can digest. And then we can help bridge these gaps. And I think that we can do that, like, if we can all come to that level. But the reason it doesn't always happen is because I just see so much fucking ego in our industry. It pains me. It pains me so deeply, the amount of ego that lives in the cybersecurity industry when you actually work with, like, technical people on certain teams. And it's got to go; otherwise, we're not going to be able to make any change. But a lot of people aren't here for change. And I respect that, you know. Not everybody is a change-maker, um, and you can't expect everybody to be, either.

Speaker 3:

Yeah, that's a good point. I think that's why social media has been so successful, or at least one of the reasons: people seek this concept of connection, or belonging. Like, I see a lot of similarities, you know, across the tech industry and people that work in a technology capacity, or an information security or cyber capacity. Uh, there is this, like, almost wanting to be, almost, like, unicorn status and special. And because of that, you know, it's kind of seeped into pop culture. And now it's like, you know, we celebrate these InfoSec celebrities, and it just becomes this whole thing. So I think that it's feeding that ego monster more. And, um, yeah, I mean, I agree with you 100%. It's really sad to see. And I think that because of it, a lot of folks are searching for other opportunities or other careers, uh, because the culture and the community, you know, has its good parts. Don't get me wrong, there are a lot of amazing people in the space, but then there's also that ugly side, right? I'm a big believer in empowerment through education. It can be overwhelming. I do every bit that I can. Um, I'm actually starting with my children, right? Like, I'm teaching my daughters about, uh, proper security hygiene and password hygiene, and why. Right? Like, the whys are so much more important than, like, the hows. And so I think that it's important that we begin educating children at a young age. Because, look, I'm a little bit older, and I'm not going to say my age, but what I will tell you is, my time, it was like the AOL time. So I remember when there was no internet that was easily accessible. Right? I remember the 14.4k modems, and I also remember beepers. So I grew up in a time where I didn't have the internet.
I didn't have a smartphone in my pocket, so I totally understand what it means to disconnect and, like, enjoy earth and life. But I feel like the generation that was born in the late eighties and nineties, they've only known a hyperconnected world. Right? Which makes it even a little bit more complex.

Speaker 2:

Even me, I'm on the younger side, right? But even me, I didn't... and maybe it's because, like, I grew up, um, on a lower socioeconomic end, right? But I didn't have a cell phone until I was able to purchase my own in my twenties, you know? And we didn't have a computer until I was in high school. So for me, even being a millennial, I was also very out of touch. Um, and we had, like, a Compaq 1993 computer, you know, like the OG dial-up. Um, and our narrative is more like an immigrant narrative, um, in a sense, and I'm sure many American non-immigrants can relate to this too. We just didn't have money for it. And so, like, I didn't have media until later in my life. So I also remember the time... like, I feel like millennials definitely grew up with that whole element of knowing that there was a time without it. And then we are between these two generations, in a sense: one that is completely out of touch, and then one that's growing up entirely in it. Yeah. I think, I think

Speaker 1:

It's an addiction, and I'm with you. I didn't have my cell phone until I was, like, 19. I had a beeper, actually, which is kind of crazy. But the funny thing is, moving forward, right, and, like, not only breaking the addiction: do you think the federal government should have any responsibility in any of this? Like, do you think they should try to legislate? Do you think, um, they're equipped to legislate? And I... well, I feel you

Speaker 2:

IQ in order to be

Speaker 1:

And 15% of the password too. Right.

Speaker 2:

I think that there are a couple of different things. I mean, um, I feel like Cyber Collective is a lot of my answer to that, and what we're trying to develop. As far as the government? Absolutely the fuck not. Um, no, no, no. I don't want the feds involved in any type of regulation around anything that I use. Like, uh, take TikTok, right? The whole element about whether it's secure. Like, from a privacy perspective, the fact that they're collecting the amount of information... and, you know, all social media platforms are pretty much collecting the same amount of data. Um, it's just the fact that China, as a foreign entity, was making money off of that data. That's the only reason why there is so much, um, drama around TikTok. They're making it seem like it's about, like, consumer data privacy, but they don't give a fuck about consumer data privacy. It's about the fact that we, as a society, as America, are not building capital off of our own citizens, and how can we let China make money off of our citizens? We need to be making money off of our citizens. So the government... I mean, I think that, just systemically, over the course of history, I don't trust the government to put together anything. Um, you know, we are currently trying to dismantle the process that currently exists. You think of surveillance technology and what the government is willing to implement, and how it's specifically criminalizing Black people. Right? Um, even with the recognition tools that are being leveraged, Clearview AI and its direct contract with, um, ICE. And there are so many different private entities, from a, like, technological perspective, that are making deals with the federal government, because of course they want more surveillance. Right? And there are a lot of different conversations that are happening.
So you think about the surveillance that the government is implementing through private entities, and then you think about the surveillance that big tech has, that the government doesn't have access to. So there are a couple of different conversations that tie into what you asked. There's the antitrust committee that exists already, that's been, uh, you know, going after the big four tech giants about monopolizing the industry and what that means. Um, and then there's also the conversation that needs to happen with our government about their tactics, right? So as far as committees are concerned and whatnot, I think that it's hard to, um, implement diverse committees in spaces where diversity is just so nonexistent, and our government is an example of that. So I think that we're going to need to create more private entities or private organizations that drive that legislation and policy forward. Like, we just got to create, like, anti-organizations to kind of, um, dismantle what they're creating. Because I think it will be years before we can get the right people in place, to, um, vote out the people that we have, and for people growing up to implement a more diverse Congress, which I think will happen eventually. But in the space that it is right now, um, I don't think that's possible. It's just a really loaded question.

Speaker 1:

Perfect segue. Because, um, you know, you talked about getting a more diverse Congress, and we've got the election coming up, right? Actually, people are already voting. What are your thoughts on, like, election security?

Speaker 2:

Until there are regulations around what these large organizations, these enterprises, can and can't do, and there are consequences for these organizations, um, I think that we will continue to see a lot of this propaganda. Like, that's not going to change. And there are so many levels to the way you can talk about election security. I think, for me, the conversation that I focus on is more around the misinformation, right? And the propaganda, and the use of, you know, our social media as a system of mass surveillance and mass manipulation. Right? There are military documents about how memes are leveraged for military operations. Um, so it's PSYOPs, it's psychological warfare. And I have a more radical perspective on this than a lot of people that may be listening. Now, I don't think that there's anything... not that there isn't anything that we're going to be able to do about it, but there's so much power that exists that's beyond the small conversations that we're having. Like, it would be foolish of me to say, yeah, we're going to be able to make this change. We can do small things. And I think the way that we do make change is making consumers aware, because people have the power. We are the masses. Right? Um, and that's the biggest gap: all the Karens aren't angry about their data being manipulated. So we got to get the right people mad, you know what I'm saying? And then that's where the change will happen, because that's

Speaker 3:

Just the way that it is. So, it's Cyber Awareness Month, right? We're trying to educate the masses. You know, you're doing a phenomenal job trying to get people woke, or woker, right? Um, educating people on the disinformation, propaganda, all of that. What are a couple of takeaways that you would give to our audience? Like, these are the things that you need to look out for; this is what you should do at a baseline level, like, tomorrow. Do this at a minimum and then work your way up. Like, don't get overwhelmed. I need suggestions.

Speaker 2:

I tend to give more, like, existential suggestions, like I said earlier. Um, in our conversation, my feedback is around, like, digital mindfulness, in a sense. So something that I try to let people take away from the content that I build is, you know, my biggest, biggest, biggest advice is: always ask questions. You know, question everything. Don't just take anything for what it is, as Bible. Um, on our screens, like, data is manipulated. That's what data is: it's manipulated information, right? Um, and so know that what's reaching your screens isn't always what you think it is. And consciously click. What you click on can lead to, um, good things and bad things happening, and sometimes, if you let yourself just be quick to click on something, you can end up having a lot of, um, malicious activity on your device. Right? And then, um, lastly, consciously share, uh, which is really important to me, because I think the biggest thing that we can do is slow down and stop just being quick to, like, tap, tap, tap and just share some meme, because we retweet... yeah, just because we saw it. Like, read the whole article, not just "I just tweeted about this." Watch the entire video. My stats on my last video, let me tell you: there were, like, hundreds and hundreds of shares, but only 4% of my video was watched. I was like, all of y'all are trash, because you're only watching 4% of something and sharing it with such insightful feedback. I literally told my followers on Instagram, just like, why are y'all doing this shit that I'm telling you not to do? Like, I don't care if it's mine or anyone else's, like, stop being a part of the problem. That's the entire element. It's just slowing down and being mindful. And if we don't let our ego get in the way, we can do it. It's just, we think that we need it at all times

Speaker 4:

For folks like our listeners who want to find out more about you, want to find out more about Cyber Collective, want to find out more about your show: like, where can they find you?

Speaker 2:

Yeah, we are on, um, most social platforms. I would say we're on LinkedIn, Twitter, and Instagram. Um, my handle everywhere is @techwithtaz, on Instagram and on Twitter. And then Cyber Collective, um, I wish we had the same handle on all social platforms, but we don't; other people caught on to the name. I guess they don't even use the pages, which, sigh. But on Instagram it's @cybercollectiveorg, and on Twitter it's @getcyco, but I can also share it with you all. But our website is simply www.cybercollective.org, and you'll find links to all of our social handles there, and also our event information, which right now is every Thursday at 6:00 PM EST, um, what we're calling CyCo Hour. So thank you so much for joining us. We really enjoyed having a cool conversation with you. I'm definitely attending your next event. We will chat soon. Thanks for joining us. Thank you. If you enjoyed this episode, don't forget to rate, subscribe, and share. You can find us on Apple Podcasts, Spotify, and Google Podcasts, among others. Follow us on Twitter or Instagram. Thanks for listening.