I tried it and the open rates for each of the three titles were fairly close. I think I'm pretty good at writing headlines so maybe that was the reason (although I've written a few stupid ones!). Or perhaps they were too similar to see a difference. I don't think I will use it again.
"Press the Shiny Button." No question. 😊
Similar results, and I think unless the statistics are more meaningful, there is potential harm done by sending some readers different titles. Another tool that matters more for really big Substacks, where a 2% improvement might mean real $. It’s only useful if you really suck at headlines or have been doing something profoundly wrong before.
I'm thinking there are bigger issues with this A/B test stuff because I sent one this morning. First of all, the differences are minuscule, like 1-2% between three different headlines. Second, only 958 people out of 4,000+ opened. This screams deliverability issues. Even my lowest of low newsletters has never been that low.
this is what I'd be afraid of....
I tried it and it’s super buggy. It doesn’t allow you to schedule a post after the test; it publishes immediately instead. And for mine, it published the headline as “Truth a” when my title was “The Ugly Truth About Rejection,” and sent it out a day early! I fear even trying it again.
I tried this and had the same issue as Larissa when I edited a scheduled post. Very annoying! Hopefully Substack fixes this glitch.
I tried it, and the post was sent but doesn’t appear on my dashboard of past posts. I can get to it from the email version I send myself…but where is it?
Oh Mary, I don't know where it is. That's concerning!
I love your posts!
https://kkschiller.substack.com
I have not tried it, but have tried Monster Insights and sometimes it has helped me make a decision, but other times it leans so quickly to clickbait that I've just said no, this is not me.
Thx for adding my reminder and I look forward to trying again!! (User error, sort of, kind of.)
I don’t know about rare circumstances - it’s a feature built into the basic publishing UI!
Thanks for the mention 😄
Yep true, Joseph. I meant it would be rare for a creator to resend posts they'd already sent before, rather than just unlocking them and sharing a link.