Monday, 19 October 2020

Why are Guest Networks Important?

Confession time.  I love port scanning: I scan my own network, and I will scan other networks when asked to.  There is so much valuable information to be gathered from scanning.  What I especially love about port scanning is the information you can glean from data that is not necessarily visible right away but that you can assume is present based on evidence.

I often preach isolated networks as a baseline security measure, both for people personally and for their small businesses.  For my own home network I have what is commonly referred to as a "Guest" network.  Setting up a guest network is quite a simple task; most home routers have this functionality built in.  I wanted to visualize a guest network for you today!

Here are two images that should cause us to pause.




The first image is of a guest network.  I am sending a basic host-discovery probe with the SYN flag set, a simple command you can run to figure out who is on the network.  As you can see it returned very little: the devices were the router, the local machine, and one other device.

In the second image I am on my primary network.  For visualization purposes I have a bunch of devices on there: my server, my primary desktop, my cell phone, etc.  I ran the command from the first image and it returned a ton of info.  For the picture I used a simple ping scan, -sP (renamed -sn in recent nmap versions), which is basically saying "Who's there? Great! NEXT HOST!!"
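If you want to play with the same idea outside of nmap, here is a minimal Python sketch of a ping sweep (assuming the system ping binary with Linux-style flags is available; nmap's host discovery is far more sophisticated than this):

```python
import ipaddress
import subprocess
from concurrent.futures import ThreadPoolExecutor

def ping(host: str) -> bool:
    """Send one ICMP echo request; True if the host answered within 1s."""
    try:
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "1", host],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        return result.returncode == 0
    except OSError:  # ping binary missing
        return False

def discover(cidr: str) -> list[str]:
    """Ping every usable address in the subnet; return the ones that are up."""
    hosts = [str(h) for h in ipaddress.ip_network(cidr).hosts()]
    with ThreadPoolExecutor(max_workers=64) as pool:
        up = pool.map(ping, hosts)
    return [h for h, alive in zip(hosts, up) if alive]

# Example: sweep your home subnet (adjust the range to match your network).
# for host in discover("192.168.1.0/24"):
#     print(host, "is up")
```

On a guest network with client isolation enabled, a sweep like this should come back nearly empty, which is exactly the point.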

So why should we reflect on these two images?  Well, on my network I actually have a lot of devices.  However, I don't want visitors to be able to access or discover my machines.  I have designed my home lab with specific purposes, many of them security based, and I don't need people muddling around where they shouldn't be.

Now imagine this was your business and you didn't have a guest network.

Someone in your waiting room could potentially discover all your servers, their operating systems, and the versions of the services they run.  They could conduct in-depth recon on every asset present on the network.

Hopefully you are cringing at the thought of this.

The images I have included are of a basic guest network.  However, many routers come with additional security features such as blocking host discovery, password authentication, and so on.

Give people what they need, not what they want.

Thanks!!

Andrew Campbell

Tuesday, 13 October 2020

Virtual Siege Warfare - Part 1

We have been fighting each other for a very long time.  Curiously, the tactics by which we fight each other have not changed much.  The tactics are the same, but the battlefield is different.

It is no mystery that ad revenue is a big deal.  Entire livelihoods depend on web traffic: people clicking on ads, with the site owner collecting a small cut from an advertiser.

This topic is going to be broken into multiple pieces.  As I read and research I am learning a lot, and there is too much to slam into one article.

Let's look back to when we were holding spears and firing arrows.  Imagine you are an attacking army and you are attempting to take a castle.  As an attacker with supplies being sent to the front, your military leaders have planned for the long haul.

Your ultimate goal is to take the castle, or destroy it.

Wouldn't it be nice if we could do damage without actually launching a full-on attack?  That is where siege warfare comes into play.  As the attacker, I will surround the castle and prevent resources from getting to the people inside.

Given some time, the people will run out of food and their stash of armaments will be depleted.  Morale will be down, and they will be hungry and tired.  From a tactical standpoint this is perfect: my enemy is weak.

Maybe they will surrender or perhaps now is the perfect time to launch my attack.

Now keep this in mind as we travel to modern times.

You own and operate a virtual shop that sells niche shoes.  A competitor has come into the market selling a very similar niche shoe.  They are cutting into your business and your revenue is down.  Both of you rely on sales, but also on people visiting and clicking on ads.

As an angry shoe monger, I want to stop them from tapping into my sweet, sweet shoe revenue.

AdSense and others use website traffic and combinations of analytics to determine eligibility to participate in their ad programs.  It is possible to destroy that source of revenue.  It can happen by accident, by unintentionally breaking something in the terms and conditions, but it can also be directed at a target maliciously.

How is it done?

Put quite simply, if a ton of the "wrong" traffic lands on a site, you run the risk of your ad revenue being discontinued.

Whichever ad programme you are working with may determine that you are attempting to commit ad fraud (fake clicks, fake visits, etc.) to grab more money from advertisers.

This happens a lot.

At this point I feel pretty comfortable with scraping.  I like being able to automate the retrieval of data from the web.  During my studying I have learned some techniques to evade common scraping prevention strategies.

As part one of the Siege discussion, I wanted to highlight the technique of spoofing user-agents.

What is a user-agent? [4]

"The User-Agent request header is a characteristic string that lets servers and network peers identify the application, operating system, vendor, and/or version of the requesting user agent."

When you spoof a user-agent you are telling the server that you are someone you are not.  

Why does this matter?

It matters because when I make a request of a webserver, my HTTP headers tell the server who I am, and that webserver then decides whether it will accept me or not.

Web servers can say "NO."  Maybe you are surfing the web with an outdated browser, maybe you are surfing from a geo-restricted location, or maybe you are clearly making scraping requests using Python!

*Ominous Music !DUN DUN DUN!

It is common for developers to block requests whose headers contain references to Python bots.  Here is a simple bot (it submits a request and retrieves the header information).
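The original screenshot doesn't reproduce here, so here is a sketch of what that bot boils down to, assuming the python-requests library (no request actually needs to be sent to see the identity it announces; the version number in the output will vary):

```python
import requests

# python-requests stamps a default User-Agent header on every outgoing
# request.  A Session object exposes those defaults without sending anything.
session = requests.Session()
print(session.headers["User-Agent"])  # e.g. python-requests/2.31.0
```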

and the output


As you can see, the user-agent clearly states that I am using "python-requests."  If I want to block some bots from accessing my site, I can specifically block this.
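On the server side, the simplest form of such a block is just a prefix check on the User-Agent string. This is a toy illustration; real sites usually do this in the web server or WAF configuration rather than in application code:

```python
# User-Agent prefixes commonly associated with scripted clients.
BLOCKED_PREFIXES = ("python-requests", "python-urllib", "curl")

def is_allowed(user_agent: str) -> bool:
    """Reject any client that openly identifies itself as a script."""
    return not user_agent.lower().startswith(BLOCKED_PREFIXES)

print(is_allowed("python-requests/2.31.0"))                     # False
print(is_allowed("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # True
```

Notice the check trusts whatever the client claims to be, which is exactly why spoofing works.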

But I am a crafty scraper and I want to access your site regardless.  So I spoof the user-agent.

Below we see a script that rotates user-agent strings and sends requests to a webserver.  (I have intentionally cropped the strings; if you want them, just give it a quick google. :) )
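Since the script itself only appears as a cropped screenshot, here is a sketch of the same idea, assuming python-requests; the User-Agent strings are deliberately truncated, as in the original:

```python
import random
import requests

# Illustrative browser User-Agent strings, truncated on purpose --
# look up full, current strings before using them for real.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 ...",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) ...",
    "Mozilla/5.0 (X11; Linux x86_64; rv:81.0) Gecko/20100101 ...",
]

def spoofed_get(url: str) -> requests.PreparedRequest:
    """Build a GET request that claims a randomly chosen browser identity."""
    headers = {"User-Agent": random.choice(USER_AGENTS)}
    request = requests.Request("GET", url, headers=headers)
    # Preparing (rather than sending) lets you inspect the headers first;
    # send with requests.Session().send(prepared) when you are ready.
    return requests.Session().prepare_request(request)

prepared = spoofed_get("https://example.com")
print(prepared.headers["User-Agent"])  # one of the strings above
```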

And then the output


Closer look


My User-Agent has been spoofed!

From a scraping viewpoint this is extremely valuable.  From a malicious actor's viewpoint, this also has merit.  If I can pretend to be someone different every time I access my target's website, there is a chance I can confuse the target's visit-tracking system.

What are the next steps?

(Future posts are going to include the following)

Well, here are the aspects I want to add to the script:

- build in proxies

- build in country selection that pairs common user-agents with country specific proxy

- build in sessions

- build in mouse movements

- add random timing for sessions
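For a rough sense of where the proxy, session, and timing items on that list are headed, requests already has the hooks. A minimal sketch (the proxy address below is a placeholder, not a real endpoint):

```python
import random
import time
import requests

# Placeholder proxy endpoint -- substitute a real one of your own.
PROXIES = {
    "http": "http://proxy.example:8080",
    "https": "http://proxy.example:8080",
}

session = requests.Session()     # sessions: persistent cookies/connections
session.proxies.update(PROXIES)  # proxies: route traffic through a relay

def visit(url: str) -> None:
    """Pause a human-ish random interval, then fetch the page."""
    time.sleep(random.uniform(1.0, 5.0))  # random timing between visits
    session.get(url, timeout=10)
```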

 

Stay tuned for future parts to this discussion!

 Andrew


References

[1] https://www.scrapehero.com/how-to-fake-and-rotate-user-agents-using-python-3/ 

[2] https://empireflippers.com/adsense-account-disabled/

[3] https://en.wikipedia.org/wiki/Web_scraping

[4] https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/User-Agent

 

Monday, 5 October 2020

A Walking Tour of Calgary Internet Exchange Points

We often talk about our ISPs (Internet Service Providers) with varying degrees of affection and contempt.  Your ISP is just downstream from the source, though.

Imagine Telus (a Canadian telecom company) as a gas station where you go to fill up your phone with data when it runs out, or as the company that keeps the data fuel flowing to your house so you can binge Netflix documentaries.

A gas station doesn't make the fuel for your car; it sells it to you.  It buys it from the companies that pull bitumen out of the ground and refine it into something useful.

However, there is a part of the internet that most people are not aware of: Internet Exchange Points (IXPs).  They exist as hubs where ISPs can connect and provide internet to all their customers.  You could, infrastructure depending, connect directly to an IXP and skip the ISP altogether, but most people don't have the start-up funds to get that kind of connection set up.

So I wanted to take you on a walking tour of the IXPs located in Calgary.  In the picture below you can see their locations.  Let's start at the top left and work our way to the bottom right.

(Full disclosure: I am using data gathered from this resource as my primary source.)




1. Cybera, Suite 200, 3512 33 St NW


Located very close to the UofC and a stone's throw from Crowchild Trail, we have Cybera.  It is at this location that we find the IXP YEGIX.



2. 1313 10th Ave SW

Across from Community Foods on 10 Ave SW lies our next IXP located in Calgary!  YYCIX peers here, and Rogers has a peering datacentre in the building (Rogers DC2).




3. 840 7th Avenue SW

Our next IXP is located in a building right beside the Sandman hotel on 7th Ave and 8th St.  Those who have ever taken a train downtown know that corner well.  You know the one; the Mac's used to be there.


Follow this link to get a general layout of the office spaces in this building.

4. 800 Macleod Trail SE

Recognize this building? You should; it is the Calgary Municipal Building.


 

5. 7007 54th St SE

Just across the street from the Calgary Soccer Centre, we have our next IXP.




6. 5300 86 Ave SE

Located in the same building as Q9 Networks, we have our final IXP, which as it turns out is just north of an Enmax South office location.




References

[1] https://www.internetexchangemap.com

[2] https://yycix.ca/

[3] https://www.cloudflare.com/learning/cdn/glossary/internet-exchange-point-ixp/