Monthly Archive: July, 2019

PHP Mobile Phone Detection

Package: PHP Mobile Phone Detection
Summary: Detect when a site user is using a mobile phone
Groups: HTTP, PHP 5, Wireless and Mobile
Author: Pierre-Henry Soria
Description: This class can detect when a site user is using a mobile phone...

Read more at https://www.phpclasses.org/package/11295-PHP-Detect-when-a-site-use-is-using-a-mobile-phone.html#2019-07-18-09:13:33

Freebie: Transport Icon Set (50 Icons, SVG & PNG)

This week we’ve another fantastic free icon set. Designed by our friends over at Freepik, we have a free transport icon set for you!

All 50 of the free transport icons are 100% customizable, all colors are editable, and come in both SVG and PNG formats. The set includes icons for cars, buses, bicycles, trucks, fire engines, ambulances, motorbikes, mopeds, trains, boats, and even hot-air balloons, blimps and spaceships! You are, of course, free to use this icon set in both your personal and commercial projects.

Preview and download the entire collection below…

Free Transport Icon Pack Preview


Download & License

Download the Free Transport Icon Pack

You are free to use this free icon pack in both your personal and commercial projects.

The post Freebie: Transport Icon Set (50 Icons, SVG & PNG) appeared first on Speckyboy Design Magazine.

Jason McCreary: Lowering the time cost of testing existing Laravel applications

In my guide to start testing your Laravel applications I mentioned two barriers to testing: time and skill. In this post, I want to address time. ⏱ I'm someone who values time over anything else. From my perspective, time is the only thing I'll never have more of. It annoys me when something ...

What is Robots.txt and How Does Implementing Robots.txt Affect SEO?

Is robots.txt the straw that’s breaking your SEO camel’s back?

Search engine optimization (SEO) includes big and small website changes. The robots.txt file may seem like a minor, technical SEO element, but it can greatly impact your site’s visibility and rankings.

With robots.txt explained, you can see the importance of this file to your site’s functionality and structure. Keep reading to find out robots.txt best practices for improving your rankings in the search engine results page (SERP).

Want effective full-service SEO strategies from a leading agency? WebFX has robust services and a team of 150+ adding expertise to your campaign. Contact us online or call us at 888-601-5359 now.

What is a robots.txt file?

A robots.txt file is a directive that tells search engine robots or crawlers how to proceed through a site. In the crawling and indexing processes, directives act as orders to guide search engine bots, like Googlebot, to the right pages.

Robots.txt files are also categorized as plain text files, and they live in the root directory of sites. If your domain is “www.robotsrock.com,” the robots.txt is at “www.robotsrock.com/robots.txt.”

Robots.txt files have two primary functions for bots:

  • Disallow (block) bots from crawling a URL path. However, the robots.txt file isn’t the same as noindex meta directives, which keep pages from getting indexed.
  • Allow crawling through a certain page or subfolder if its parent has been disallowed.

Robots.txt directives are more like suggestions than unbreakable rules for bots, and your pages can still end up indexed and in the search results for select keywords. Mainly, the files control the strain on your server and manage the frequency and depth of crawling.

The file designates user-agents, which either apply to a specific search engine bot or extend the order to all bots. For example, if you want Google to crawl your pages consistently but not Bing, you can address a directive to Googlebot as the user-agent.

Website developers or owners can prevent bots from crawling certain pages or sections of a site with robots.txt.
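
As a rough sketch of how these pieces fit together (the paths below are hypothetical, not taken from any real site), a robots.txt file groups its directives under user-agents like this:

# Rules for every crawler
User-agent: *
Disallow: /private/
Allow: /private/open-page.html

# Rules that apply only to Bing's crawler
User-agent: Bingbot
Disallow: /experiments/

Sitemap: https://www.robotsrock.com/sitemap.xml

Each User-agent line starts a new group of rules, and a crawler such as Googlebot follows the most specific group that matches its name and ignores the others.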

Why use robots.txt files?

You want Google and its users to easily find pages on your website — that’s the whole point of SEO, right? Well, that’s not necessarily true. You want Google and its users to effortlessly locate the right pages on your site.

Like most site owners, you probably have thank-you pages that follow conversions or transactions. Do thank-you pages qualify as the ideal choices to rank and receive regular crawling? It’s not likely.

It’s also common for staging sites and login pages to be disallowed in the robots.txt file.

Constant crawling of nonessential pages can slow down your server and present other problems that hinder your SEO efforts. Robots.txt is the solution to moderating what bots crawl and when.
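
For example, a minimal robots.txt sketch that keeps crawlers away from those nonessential pages might look like this (the paths are illustrative and would need to match your own site):

User-agent: *
# Post-conversion thank-you pages
Disallow: /thank-you/
# Staging copy of the site and the login page
Disallow: /staging/
Disallow: /login/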

One of the reasons robots.txt files help SEO is that they let new optimization work get processed quickly. When crawlers check in, they register changes to your header tags, meta descriptions, and keyword usage, and search engines can reflect those positive developments in your rankings as soon as possible.

As you implement your SEO strategy or publish new content, you want search engines to recognize the modifications you’re making and the results to reflect these changes. If you have a slow site crawling rate, the evidence of your improved site can lag.

Robots.txt files can make your site tidy and efficient, although they don’t directly push your page higher in the SERPs. They indirectly optimize your site, so it doesn’t incur penalties, sap your crawl budget, slow your server, or funnel link equity into the wrong pages.

4 ways robots.txt files improve SEO

While using robots.txt files doesn’t guarantee top rankings, it does matter for SEO. They’re an integral technical SEO component that lets your site run smoothly and satisfies visitors.

SEO aims to make your pages load quickly for users, deliver original content, and boost your highly relevant pages. Robots.txt plays a role in making your site accessible and useful.

Here are four ways you can improve SEO with robots.txt files.

1. Preserve your crawl budget

Search engine bot crawling is valuable, but crawling can overwhelm sites that don’t have the muscle to handle visits from bots and users.

Googlebot sets aside a budgeted amount of crawling for each site based on that site’s size and authority. Larger sites and sites that hold immense authority get a bigger allowance from Googlebot.

Google doesn’t clearly define the crawl budget, but they do say the objective is to prioritize what to crawl, when to crawl, and how rigorously to crawl it.

Essentially, the “crawl budget” is the allotted number of pages that Googlebot crawls and indexes on a site within a certain amount of time.

The crawl budget has two driving factors:

  • Crawl rate limit puts a restriction on the crawling behavior of the search engine, so it doesn’t overload your server.
  • Crawl demand is driven by popularity and freshness, which determine whether the site needs more or less crawling.

Since you don’t have an unlimited supply of crawling, you can use robots.txt to steer Googlebot away from extra pages and point it toward the significant ones. This eliminates waste from your crawl budget, and it saves both you and Google from worrying about irrelevant pages.

2. Prevent duplicate content footprints

Search engines tend to frown on duplicate content, but what they specifically penalize is manipulative duplicate content. Legitimate duplicates, like PDF or printer-friendly versions of your pages, won’t earn your site a penalty.

However, you don’t need bots to crawl duplicate content pages and display them in the SERPs. Robots.txt is one option for minimizing your available duplicate content for crawling.

There are other methods for informing Google about duplicate content like canonicalization — which is Google’s recommendation — but you can rope off duplicate content with robots.txt files to conserve your crawl budget, too.
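
If you do use robots.txt for this, the sketch below assumes, purely for illustration, that printer-friendly copies live under a /print/ path and that your PDF duplicates should not be crawled:

User-agent: *
# Printer-friendly duplicates of regular pages
Disallow: /print/
# The * and $ wildcards are honored by Google and Bing, but not by every crawler
Disallow: /*.pdf$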

3. Pass link equity to the right pages

Equity from internal linking is a special tool to increase your SEO. Your best-performing pages can bump up the credibility of your poor and average pages in Google’s eyes.

However, robots.txt files tell bots to take a hike once they’ve reached a page with the directive. That means they don’t follow the linked pathways or attribute the ranking power from these pages if they obey your order.

Your link juice is powerful, and when you use robots.txt correctly, the link equity passes to the pages you actually want to elevate rather than those that should remain in the background. Only use robots.txt files for pages that don’t need equity from their on-page links.

4. Designate crawling instructions for chosen bots

Even within the same search engine, there are a variety of bots. Google has crawlers apart from the main “Googlebot”, including Googlebot Image, Googlebot Video, AdsBot, and more.

You can direct crawlers away from files that you don’t want to appear in searches with robots.txt.  For instance, if you want to block files from showing up in Google Images searches, you can put disallow directives on your image files.
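
A minimal sketch of that idea (the /images/ path is illustrative) targets only Google’s image crawler and leaves other bots untouched:

# Applies only to Google's image crawler; other bots ignore this group
User-agent: Googlebot-Image
Disallow: /images/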

Robots.txt can also deter search engine bots from crawling personal directories, but remember that it doesn’t protect sensitive or private information.

Partner with WebFX to make the most of your robots.txt

Robots.txt best practices can add to your SEO strategy and help search engine bots navigate your site. With technical SEO techniques like these, you can hone your website to work at its best and secure top rankings in search results.

WebFX is a top SEO company with a team of 150+ professionals bringing expertise to your campaign. Our SEO services are centered on driving results, and with over 4.6 million leads generated in the last five years, it’s clear we follow through.

Interested in getting the highest quality SEO services for your business? Contact us online or call us at 888-601-5359 now to speak with a qualified team member.

The post What is Robots.txt and How Does Implementing Robots.txt Affect SEO? appeared first on WebFX Blog.

Site News: Blast from the Past – One Year Ago in PHP (07.18.2019)

Here's what was popular in the PHP community one year ago today:

PHP Internals News: Episode 19: Deprecate curly brace syntax

Dividing responsibilities – Part 1

I'm happy to share with you an excerpt of my latest book, which is currently part of Manning's Early Access Program. Take 37% off Object Design Style Guide by entering fccnoback into the discount code box at checkout at manning.com.

Chapter 7: Dividing responsibilities

We've looked at how objects can be used to retrieve information, or perform tasks. The methods for retrieving information are called query methods, the ones that perform tasks are command methods. Service objects may combine both of these responsibilities. For instance, a repository (like the one in Listing 1) could perform the task of saving an entity to the database, and at the same time it would also be capable of retrieving an entity from the database.

Listing 1. The PurchaseOrderRepository can save and retrieve a PurchaseOrder entity.

interface PurchaseOrderRepository
{
    /**
     * @throws CouldNotSavePurchaseOrder
     */
    public function save(PurchaseOrder $purchaseOrder): void;

    /**
     * @throws CouldNotFindPurchaseOrder
     */
    public function getById(int $purchaseOrderId): PurchaseOrder;
}
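
As a small usage sketch (the excerpt doesn’t include a concrete implementation, so assume one exists behind the interface), a client sees the command/query split like this:

// $repository is some concrete PurchaseOrderRepository implementation.

// Command method: performs the task of saving the entity, returns nothing.
$repository->save($purchaseOrder);

// Query method: returns information, without modifying anything.
$purchaseOrder = $repository->getById($purchaseOrderId);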

Since saving and retrieving an entity are more or less each other's inverse operations, it's only natural to let one object have both responsibilities. However, in most other cases you will find that performing tasks and retrieving information are better off being divided amongst different objects.

Separate write models from read models

As we saw earlier, there are services, and other objects. Some of these other objects can be characterized as Entities, which model a particular domain concept. In doing so, they contain some relevant data, and offer ways to manipulate that data in valid and meaningful ways. Entities can also expose data, allowing clients to retrieve information from them, whether that is exposed internal data (like the date on which an order was placed), or calculated data (like the total amount of the order).

In practice, it turns out that different clients use entities in different ways. Some clients will want to manipulate an entity's data using its command methods, while others just want to retrieve a piece of information from it using its query methods. Nevertheless, all these clients will share the same object, and potentially have access to all the methods, even when they don't need them, or shouldn't even have access to them.

You should never pass an entity that can be modified to a client that isn't allowed to modify it. Even if the client doesn't modify it today, one day it might, and then it will be hard to find out what happened. That's why the first thing you should do to improve the design of an entity is to separate the Write model from the Read model.

We'll find out how to accomplish this by looking at an example of a PurchaseOrder entity (Listing 2). A purchase order represents the fact that a company buys a product from one of its suppliers. Once the product has been received, it's shelved in the company's warehouse. From that moment on the company has this product in stock. We'll use the same example for the remaining part of this chapter and work out different ways to improve it.

Listing 2. The PurchaseOrder entity.

final class PurchaseOrder
{
    private int $purchaseOrderId;
    private int $productId;
    private int $orderedQuantity;
    private bool $wasReceived;

    private function __construct()
    {
    }

    public static function place(
        int $purchaseOrderId,
        int $productId,
        int $orderedQuantity
    ): PurchaseOrder {
        /*
         * For brevity, we use primitive type values, while in
         * practice, the use of value objects is recommended.
         */

        $purchaseOrder = new self();

        $purchaseOrder->purchaseOrderId = $purchaseOrderId;
        $purchaseOrder->productId = $productId;
        $purchaseOrder->orderedQuantity = $orderedQuantity;
        $purchaseOrder->wasReceived = false;

        return $purchaseOrder;
    }

    public function markAsReceived(): void
    {
        $this->wasReceived = true;
    }

    public function purchaseOrderId(): int
    {
        return $this->purchaseOrderId;
    }

    public function productId(): int
    {
        return $this->productId;
    }

    public function orderedQuantity(): int
    {
        return $this->orderedQuantity;
    }

    public function wasReceived(): bool
    {
        return $this->wasReceived;
    }
}

In the current implementation,

Truncated by Planet PHP, read more at the original (another 11989 bytes)

Build a Fully-Custom Real Estate Website with HomePress WordPress Theme Sponsored

When it comes to building a website, each industry has its own specific needs. They require features, functionality and data types that reflect their particular niche.

This can be a challenge for web designers, as we often tend to work with one-size-fits-all solutions, such as WordPress themes. That can result in a lot of extra work when it comes to customizing things to fit our projects.

The real estate industry is a prime example. Users expect high-end conveniences, while clients look for features that make their jobs easier. Therefore, a general-purpose WordPress theme just won’t cut it. You need a theme that understands the intricacies of the industry and delivers a top-notch user experience.

HomePress is the WordPress theme that offers a complete solution for building custom real estate websites. Built specifically for the industry, it has everything you need to get up-and-running in no time.

Let’s take a deeper look at how HomePress can help you stand out!

HomePress WordPress Theme home page example.

Advanced Features at Your Fingertips

The keys to a great real estate site are intuitive design and powerful features. Users must be able to easily navigate the site and have tools at their disposal that allow them to find what they’re looking for. On the back end, property listings should be easy to create, manage and customize.

HomePress covers all of these aspects with a customer-first approach. Among its many features, you’ll find:

Powerful Search Functionality

Search is a crucial factor in the success of your real estate website. That’s why HomePress allows for unlimited custom fields, radius and geolocation-based search and “live” autocomplete search suggestions. Users can customize their search based on available criteria to narrow down results.

HomePress search features.

Custom Property Listings and More

HomePress comes bundled with the incredible uListing plugin from StylemixThemes. Built with Vue.js, the plugin makes it easy to build custom property listings via a drag-and-drop UI. Use it to build your inventory grid, list and single property layouts with ease. And, you can wow your visitors with the use of 360° Virtual Tours!

uListing Drag-and-Drop Builder.

User-Friendly Conveniences

Finding the right property often means doing research. Therefore, it’s important to offer convenient features that help users make the right decision. HomePress offers your visitors the ability to save their favorite properties, compare listings and calculate mortgage payments.

Ready to Use Templates

HomePress includes an array of ready to use templates that instantly provide you with a beautiful, fully-responsive look. Choose from 10+ demo themes, 8+ inventory pages and 5+ single property pages. Use them as-is or customize to match your needs.

HomePress home page example layouts.

Integration with Top IDX/MLS Services

Want to display MLS listings on your site? HomePress integrates with the most popular IDX broker plugins on the market, including IDX Broker, iHomeFinder and Realtyna. The theme has been thoroughly tested with each plugin to ensure a conflict-free experience.

Front-End Registration and Listings Management

Users can manage their account directly through the front-end of your site. They can register and save their favorite properties for later reference. In addition, you can allow users to submit their own listings and manage their profile. You can also monetize your site and accept payments via PayPal and Stripe.

Detailed Statistics

To fully measure the success of your property listings, you need data. That’s why HomePress features a page statistics view. It displays audience numbers you can use to gauge interest.

HomePress listing statistics table.

Top-Quality from a Theme Author You Can Trust

A great WordPress theme is more than just looks and features. The best themes have a dedicated team behind them, constantly making improvements and providing terrific support.

StylemixThemes, an Envato Power Elite Author, fully stands behind their work. So, you can rest assured that HomePress was built with the highest standards and best practices. Everything has been optimized for speed and multilanguage support.

You’ll also find that great care has been taken to ensure that HomePress caters to the specific needs of the real estate industry. It was built with realtors in mind and its features reflect extensive research.

HomePress also integrates with a wide array of top plugins, including Contact Form 7, MailChimp for WordPress and more. Users of the Elementor Page Builder will be thrilled to know that HomePress includes 13 exclusive widgets to work in tandem with the plugin.

What’s more, the theme comes with lifetime updates and six months of premium 24/7 support.

HomePress feature information.

Get Your Copy of HomePress!

HomePress is the most complete solution for building custom real estate websites with WordPress. Purchase your copy today and build a site that both users and clients will love.

The post Build a Fully-Custom Real Estate Website with HomePress WordPress Theme (Sponsored) appeared first on Speckyboy Design Magazine.

Powered by Gewgley